Releases: ubcaerodesign/codex

Minnie, Composite Image Maker, Ground Truth Data

28 Mar 08:19

So much good stuff in this release.

Changes/new features

  1. OpenCV stitcher + PyQt centroid selection, plus a fully manual PyQt composite image maker by @jingchusu, @kevinlinxc, @mxrah10
  2. Minnie, our map UI for displaying composite images overlaid on a map, by @evinli
  3. Ground truth dataset pushed through the system instead of the old fake data by @kevinlinxc
  4. Gamer GUI style updated by @amandayyang15
  5. Reading serial data from the serial radio by @kevinlinxc
  6. ubcaerodesign Python package for reused helper functions by @midorashiu
  7. YOLOv8 training and model integration (but too slow for now) by @mannhingrajia

How to run the system

Follow the README instructions for setting up the Poetry environment and Docker.

  1. Activate the Poetry environment (source .venv/bin/activate on macOS or .venv\Scripts\activate on Windows)

  2. Run docker compose up -d --build to start Redis, Minnie, and Icarus. In a browser, open localhost:3000 for Icarus and localhost:3001 for Minnie.

  3. In the same terminal, run python gamer-gui/main.py --test. If it's your first time running it, it will complain about missing ground truth data images; follow the error message to download them.

This starts Gamer GUI in test mode, publishing images and flight data from the ground truth dataset into the system. On Gamer GUI you should see images appearing, and on Icarus you should see altitude updating.
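For context, the test-mode publisher pushes flight data and image metadata as messages into the system's channels. A minimal sketch of what such a round-trip might look like, using a plain list in place of Redis and hypothetical field names (the real channel names and schema live in gamer-gui and Mercury):

```python
import json

# Hypothetical flight-data message, modeled on the ground truth dataset
# described above (position and altitude per image). Field names are
# illustrative, not the system's actual schema.
def make_flight_message(image_id, lat, lon, altitude_m):
    return json.dumps({
        "image_id": image_id,
        "lat": lat,
        "lon": lon,
        "altitude_m": altitude_m,
    })

# Stand-in for a Redis publish/subscribe channel: a plain list.
channel = []

def publish(msg):
    channel.append(msg)

def consume():
    return [json.loads(m) for m in channel]

publish(make_flight_message("img_0001", 49.2606, -123.2460, 120.0))
```

Consumers like Icarus then read the decoded messages and pick out the fields they care about (e.g. altitude_m for the altitude plot).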

On Gamer GUI, look for two images that contain the same target. Click one at a time, click its centroid in the main area (press c to disable centroid snapping), then press s to save it to the staging area. Once you have two images in the staging area, press g.

This will try to run the OpenCV stitcher. If that succeeds, you just have to click the centroid in the PyQt window and press Enter (if the result looks bad, press Esc to fall back to the custom stitcher).
If the OpenCV stitcher fails, use the custom stitcher to create a good composite image. The controls are listed in the text at the top of it; press Enter to confirm.

After confirming in either stitcher, the GraphX algorithm runs, computing the GPS coordinates of the corners of the image and of the centroid. This bundle is sent to Minnie for verification (you should see the image appear on Minnie).
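GraphX itself isn't documented here, but the core idea, projecting image corners and the centroid to GPS coordinates from the plane's position and altitude, can be sketched for the simplest case: a nadir-pointing camera with no roll or pitch, using a local flat-Earth approximation. The FOV values and function name below are illustrative, not GraphX's actual parameters:

```python
import math

EARTH_R = 6371000.0  # mean Earth radius, metres

def ground_footprint(lat, lon, alt_m, hfov_deg=60.0, vfov_deg=45.0):
    """Approximate GPS corners of the ground area seen by a nadir camera.

    Half-extent on the ground = altitude * tan(FOV / 2); metres are then
    converted to degrees with an equirectangular approximation. The FOV
    defaults are placeholders, not the real camera's values.
    """
    half_w = alt_m * math.tan(math.radians(hfov_deg) / 2)  # metres, east-west
    half_h = alt_m * math.tan(math.radians(vfov_deg) / 2)  # metres, north-south
    dlat = math.degrees(half_h / EARTH_R)
    dlon = math.degrees(half_w / (EARTH_R * math.cos(math.radians(lat))))
    return [
        (lat + dlat, lon - dlon),  # top-left
        (lat + dlat, lon + dlon),  # top-right
        (lat - dlat, lon + dlon),  # bottom-right
        (lat - dlat, lon - dlon),  # bottom-left
    ]

corners = ground_footprint(49.26, -123.25, 100.0)
```

With the corners known, the centroid's pixel position can be interpolated between them to get its GPS coordinates; the real algorithm also has to account for camera attitude.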

If it looks good, you can load the target in Gamer GUI and send it to the PADA.

Running locally

To avoid sending images to the Digital Ocean server, you can run the server locally.

The easiest way to do this is to run python test.py --local --test.

This runs the local docker compose file, then starts Gamer GUI in --local --test mode. It also opens the browser for you automatically.
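Conceptually, a launcher like this just assembles and runs the same commands you would type by hand. A sketch of the command assembly for the flags described above; the local compose file name is an assumption, and the real test.py may differ:

```python
import shlex

def build_commands(local=False, test=False):
    """Build the argv lists a test.py-style launcher would run.

    Mirrors the commands in these release notes: a docker compose
    bring-up followed by Gamer GUI with the matching flags.
    """
    compose = ["docker", "compose"]
    if local:
        # Hypothetical filename for the local compose file.
        compose += ["-f", "docker-compose.local.yml"]
    compose += ["up", "-d", "--build"]

    gui = shlex.split("python gamer-gui/main.py")
    if local:
        gui.append("--local")
    if test:
        gui.append("--test")
    return [compose, gui]

cmds = build_commands(local=True, test=True)
```

Each argv list could then be handed to subprocess.Popen, with the browser opened via the webbrowser module once the services are up.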

v0.0.5 Test.py

19 Nov 07:57
f103092

To run the whole system, set up Docker and the Poetry environment following the README, then run python test.py.

Big changes since last release, some highlights:

  1. Run the whole system with python test.py or python test.py --local. This makes testing the system much simpler. At competition the plane will run separately, but this method is good for testing purposes.
  • python test.py runs plane, our backend (Redis + Websocket server + Mercury + Icarus), and Gamer GUI all automatically.
  • python test.py --local does all of the above, but also runs the telemetry ZMQ-echoing server locally. Plane sends to the local server, which forwards to Mercury and onward. This lets multiple people test at once, which was not possible with the shared Digital Ocean remote server.
  2. Icarus is now in React, and looks great. We send data from Mercury to a websocket server, which the React app reads from to serve the frontend at localhost:3000. This gets opened automatically by test.py.
  3. Lots of code has been refactored in Plane, Mercury, and Gamer GUI.
  4. Lots of quality-of-life updates
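The Mercury-to-React path above boils down to serializing telemetry into frames the frontend can parse over the websocket. A minimal sketch of one such frame; the field names are assumptions, not the actual wire format:

```python
import json
import time

def telemetry_frame(altitude_m, lat=None, lon=None):
    # One websocket message as Icarus might consume it.
    # Hypothetical schema: timestamp plus whatever telemetry is available.
    return json.dumps({
        "ts": time.time(),
        "altitude_m": altitude_m,
        "lat": lat,
        "lon": lon,
    })

frame = json.loads(telemetry_frame(120.0))
```

On the React side, the app would JSON.parse each message and update the altitude plot from the decoded fields.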

Plane.py put into Docker

05 Sep 02:32

This release is the closest we have come so far to running the whole 2023 comp system in Docker. It can be used to demo the system fairly easily, with the exception of Gamer GUI.

Install Docker Desktop.

Run docker compose up --build -d to run:

  • Plane
  • Mercury
  • Redis
  • Icarus
  • Atlas

Just by running this, the plane should be sending altitude data to Icarus and flight path data to Atlas. You can see those by going to localhost:8502 and localhost:8501 respectively.

Unfortunately, Gamer GUI seems to work only on macOS for some reason. If you try to run it (poetry install, then python gamer-gui/main.py), it will communicate with the rest of the system on macOS, but nothing will happen on Windows. See the previous release (v0.0.3) to run the whole system manually.

Docker 2023 Comp Setup

09 Jul 04:45

In this release, you can run four things with Docker instead of through some other means:

  1. Redis
  2. Mercury
  3. Atlas
  4. Icarus

To run this snapshot, first install Docker Desktop for your system. Launch Docker Desktop, which starts the Docker engine on your computer. You'll know it's ready when docker info in a terminal returns some information.

Then, do the following:

  1. Run poetry install if you haven't already, to install the dependencies for gamer-gui and plane
  2. Activate the virtual environment (and do so in every new terminal if needed)
  3. If you get a "WSL kernel needs updating" error when launching the desktop app, run wsl --update in Command Prompt and everything should work
  4. Run docker compose up in the codex directory to start the above four services. Open localhost:8501 and 8502 in a browser to see Icarus and Atlas. Optionally, Redis Insight can be viewed at localhost:8001.
  5. Run python telemetry/server.py on the server, which is currently at 144.126.213.3. Change this address everywhere it appears in the code if a new server is being used.
  6. In another terminal locally, run python gamer-gui/main.py to start Gamer Gui.
  7. Finally, open a new terminal and run python plane/plane.py to start the fake plane script. It sends data to the server, which forwards it to Mercury in Docker, which writes it to Redis in Docker; you should then see GPS data on Atlas and altitude on Icarus. On Gamer GUI, set the plane state mode to Collect, and fake images should start appearing. Create prediction points with gamer-gui using the controls in the top left, or use a grid prediction point pre-loaded into Redis by Atlas, and tell plane.py to drop the PADA at the prediction point.

Video demonstration:
https://youtu.be/CSTwVpcAn28

plane.py isn't in Docker because it would normally be running on the plane. Gamer GUI could technically be put in Docker, but I find the UI going through Docker to be kind of funky and not elegant. Gamer GUI, Icarus, and Atlas might all be replaced soon, so this snapshot is less relevant, though it is useful if we want to go back to using the grid method.

Running 2023 Comp Setup Without Docker

09 Jul 00:13

This release describes how to run all of the architecture that was used in the 2023 design. Here is a diagram of how the system fits together:
[system diagram]

  1. Set up the poetry environment with poetry install and activate the virtual environment if not already activated
  2. Install and run Redis (Memurai if on Windows)
  3. SSH into the server (currently root@144.126.213.3; change instances of the address if a new server is being used), do the same installation with the codex repo, and run python telemetry/server.py. This starts two ZMQ channels that pass traffic between the ground and the "plane".
  4. Run Mercury with python mercury/main.py
  5. Run Atlas with streamlit run atlas/atlas.py. Open a browser tab to http://localhost:8502/ if it doesn't open automatically.
  6. Run Icarus with streamlit run atlas/icarus.py. Same as before, but at localhost:8501.
  7. Run gamer-gui with python gamer-gui/main.py. This should open a PyQt UI.
  8. Run the fake plane with python plane/plane.py. Once it moves from preflight mode to idle mode, it should start sending fake data. Print statements should appear in the SSH terminal where server.py is running, Mercury should be showing data, random altitudes should show up on Icarus, and fake GPS data should show on Atlas. If nothing is happening, try restarting Mercury.
  9. On gamer-gui, set the state to "collect" and send the state. This propagates through the system, and fake images should start appearing in gamer-gui. From here, you can select centroids of targets in the main console of Gamer GUI, save them to bins, and run the X algorithm on bins. Target predictions generated this way are stored with the prefix "g" in Redis. There are also hardcoded targets made by Atlas on startup, displayed in a grid on Atlas; these have the prefix "h". You can send any of these predictions to the plane with the gamer-gui controls, and then tell the plane to drop, also via gamer-gui. The grid of targets is based on the idea that, at the moment, we can eyeball where the targets are from the camera footage better than we can trust the X algorithm.

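The "g"/"h" prefix convention from step 9 can be illustrated with a plain dict standing in for Redis. The exact key format here is an assumption; check the actual code for the real scheme:

```python
# Stand-in for Redis: keys prefixed "g" for gamer-gui predictions,
# "h" for the hardcoded grid targets that Atlas creates on startup.
store = {}

def save_prediction(source, target_id, lat, lon):
    # Map the producer to its key prefix (key format is hypothetical).
    prefix = {"gamer-gui": "g", "atlas-grid": "h"}[source]
    store[f"{prefix}:{target_id}"] = (lat, lon)

def predictions_from(prefix):
    # Fetch all predictions from one producer, mimicking a Redis key scan.
    return {k: v for k, v in store.items() if k.startswith(prefix + ":")}

save_prediction("gamer-gui", 1, 49.26, -123.25)
save_prediction("atlas-grid", 1, 49.27, -123.24)
```

Prefixing by producer lets the ground station list, compare, and send either kind of prediction to the plane without the two sources colliding in the key space.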
Note that you can also install RedisInsight to see what is happening in Redis.