# Game Piece Position Estimation - Step-by-Step Guide
Welcome! This guide walks you through setting up the project, generating images with Blender, processing data, and using the pose estimation algorithm to predict a game piece’s position relative to a robot.
## Prerequisites

- Operating system: Windows, macOS, or Linux
- Python: 3.10+ (3.11 recommended)
- Blender: 4.5 or newer (older versions may fail to open the provided `.blend` file)
- Git (optional, for cloning)
Python installation

If Python isn't installed, install it first; you can follow this guide: Python installation guide. After installation, ensure `python` and `pip` are available from your terminal.
## Download the project
- Click Code → Download ZIP on the repository page.
- After the download completes, extract the ZIP to your desired folder.
## Program setup (Python)
Open a terminal/command prompt and navigate to the project folder you cloned/extracted in the previous step.
### (Recommended) Create a virtual environment
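The standard commands look like this (a sketch using Python's built-in `venv`; the environment name `.venv` is a common convention, not something the project mandates):

```shell
# Create a virtual environment named .venv in the project root
python -m venv .venv

# Activate it - macOS/Linux:
source .venv/bin/activate

# Activate it - Windows (PowerShell):
# .venv\Scripts\Activate.ps1
```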
### Install required packages
From the project root:
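Assuming the project ships a `requirements.txt` (it is referenced in the Troubleshooting section below), the install step is:

```shell
# Install all Python dependencies listed by the project
pip install -r requirements.txt
```

If `pip` resolves to a different Python than the one you intend, `python -m pip install -r requirements.txt` is the safer form.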
## Blender
- Download and install Blender from: blender.org
- Open the `template.blend` file included in this repository.
Warning
If the file doesn't open, ensure you're using Blender 4.5 or newer. Older versions may not be compatible.
## Set up Blender Python packages
- In Blender, click Window → Toggle System Console so you can see script output.
- Navigate to the Scripting tab.
- In the script drop-down, select `Install requirements`.
- Click Run and verify there are no errors in the console.
## Blender Setup & Image Generation
### 4.1 Capture reference images
Connect your robot's camera to your computer.
Run the `image_capture.py` script and capture a few photos with the game piece clearly in frame. These will be used to calibrate the camera.
Tip
Take these photos on a floor with a tile grid pattern (e.g., a classroom floor). Aligning Blender’s grid to a real grid makes calibration much easier.
### 4.2 Calibrate the camera in Blender
- Open `template.blend`.
- Measure your robot’s camera height above the ground and its rotation.
- Click the camera icon (in the viewport) to view through the camera.
- Select the camera in the Outliner.
- Open Object Data Properties → Background Images, enable it, and load one of the photos captured in the last step.
- Adjust position and rotation under Object Properties.
- Adjust Field of View and Sensor Size under Object Data Properties.
- Align the camera so Blender’s grid matches the floor’s grid in your background image. You can also modify Blender’s grid size to match the physical tile size on your floor.
Optional: fSpy for a head start
You can use fSpy to get a close initial camera estimate before fine-tuning in Blender.
### 4.3 Import the game piece model
- Download the official game piece model (e.g., from FIRST’s website) and export/save it as `.stl` from SolidWorks.
- In Blender: File → Import → STL, then select the saved model.
- After import, you will likely need to scale by 0.001 (1/1000) because SolidWorks uses millimeters and Blender uses meters.
- With the object selected, open Object Properties → Scale and set the scale.
- Press `F3`, search for `Origin to Geometry`, and press Enter to center the origin.
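The 0.001 scale factor is just a unit conversion; a minimal sanity check (plain Python, independent of Blender):

```python
def to_meters(mm: float) -> float:
    """Convert a SolidWorks dimension (millimeters) to Blender units (meters)."""
    return mm / 1000  # same effect as scaling the object by 0.001

# Example: a 350 mm wide game piece should measure 0.35 m in Blender.
print(to_meters(350))  # -> 0.35
```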
### 4.4 Define the working area with `FOV_Plane`
- Open the Camera Plane Set Up tab.
- Click the camera icon (in the viewport) to switch to the camera’s view.
- The viewport is split so you have a left (camera) view and a right (top-down) view.
- In the left view, use Move (from the top toolbar). In the right/top view:
- Select the plane object named `FOV_Plane`.
- Press `Tab` to enter Edit Mode.
- Select vertices and move them along X and Y to match the camera FOV visible in the left view.
Tip

Press `S` → `X` to scale two or more selected vertices along the X axis.
### 4.5 Run the main Blender script
- Select the imported game piece, go to Material Properties, and assign the `Red` material.
- Window → Toggle System Console (if not already open) to view outputs.
- Go to the Scripting tab and select the `main` file from the dropdown.
- Review the configuration variables at the top of the script and adjust as needed.
- Run the script and wait for rendering to complete.
## Process Blender data
After rendering finishes, open the `ProcessBlenderData.ipynb` Jupyter notebook and run all cells.

Output: a CSV file containing all possible game piece positions and their bounding rectangles in the generated images.
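To consume that CSV in your own code, a loading sketch like the following works with the standard library; the column names here are illustrative assumptions, so match them to the notebook's actual output:

```python
import csv

def load_position_table(path):
    """Read the generated CSV into a list of dicts with float values.

    Assumes a header row such as: x, y, bbox_x, bbox_y, bbox_w, bbox_h
    (hypothetical names - check the notebook's real columns).
    """
    with open(path, newline="") as f:
        return [
            {key: float(value) for key, value in row.items()}
            for row in csv.DictReader(f)
        ]
```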
## Estimate the position
Open the `PoseEstimation.ipynb` notebook to accurately predict the game piece’s position relative to the robot, given the bounding rectangle in an image.
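The core idea can be sketched as a nearest-neighbor lookup against the generated table: pick the row whose stored bounding rectangle best matches the detected one. This is an illustration of the approach under assumed data layout, not necessarily the notebook's exact method:

```python
def estimate_position(table, bbox):
    """Return the (x, y) position whose stored bounding rectangle is
    closest, in squared pixel distance, to the observed `bbox`.

    Each table row is assumed (hypothetically) to look like:
        {"position": (x, y), "bbox": (bx, by, bw, bh)}
    """
    def distance(row):
        return sum((r - b) ** 2 for r, b in zip(row["bbox"], bbox))

    return min(table, key=distance)["position"]

# Tiny hand-made table: nearer pieces appear lower and larger in the image.
table = [
    {"position": (0.5, 0.0), "bbox": (300, 200, 80, 60)},
    {"position": (1.0, 0.2), "bbox": (310, 150, 40, 30)},
]
print(estimate_position(table, (305, 195, 78, 61)))  # -> (0.5, 0.0)
```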
Note
For a production-ready example of integrating the algorithm on-device, see the `RaspberryPiCode/` folder.
## Troubleshooting
Blender file won’t open
Ensure you’re using Blender 4.5 or newer. Update Blender if necessary.
`pip install -r requirements.txt` fails
- Activate your virtual environment first.
- Upgrade pip: `python -m pip install --upgrade pip`
- On macOS/Linux, try `pip3` instead of `pip` if multiple Python versions exist.
Background image won’t show in camera view
- Confirm you enabled Background Images in Object Data Properties for the camera.
- Check the opacity and that you’re actually viewing through the camera.
Imported STL is too big/small
- SolidWorks uses mm, Blender uses m. Scale by 0.001 or 1000 accordingly.
- Apply scale if needed (`Ctrl+A` → Scale).
The `main` script can’t find paths/files
- Confirm paths are correct and files exist in the expected folders.
- Run Blender from the project root or adjust path handling in the script.
## Attributions
- Blender® is a registered trademark of the Blender Foundation.
- fSpy is a community project for camera matching.
- FIRST® references for the game piece model.