diff --git a/404.md b/404.md index 392573d811..52fea8faf6 100644 --- a/404.md +++ b/404.md @@ -3,4 +3,4 @@ title: 404 layout: 404 --- -The page you're trying to load was not found \ No newline at end of file +The page you're trying to load was not found. \ No newline at end of file diff --git a/README.md b/README.md index f38f20d8c6..3183a89fb5 100644 --- a/README.md +++ b/README.md @@ -1,5 +1,8 @@ # John Lyle's Portfolio -## To Do List -* AugRE Work -* Drop Drone RL +A Jekyll-based portfolio website showcasing my engineering projects, experiences, and technical capabilities. + +## Current Projects +* Gaze-based multi-agent control research +* 3D printing with robotic arms +* Advanced battlebot development diff --git a/_config.yml b/_config.yml index 2a595dcd14..c54ed3112c 100644 --- a/_config.yml +++ b/_config.yml @@ -47,7 +47,7 @@ author-name: John Lyle # URL for the Image of the Author author-image: /assets/images/headshot.png # 60 Words About the Author -author-about: I am a fourth-year mechanical engineering student at UT Austin. I have experience in electronics, software, and mechanical design, and love the interdisciplinary nature of robotics. I am passionate about developing innovative R&D solutions to improve human lives and mitigate risks. +author-about: "I am a mechanical engineering master's student from UT Austin with experience in electronics, software, and mechanical design. I love the interdisciplinary nature of robotics and am passionate about developing innovative R&D solutions to improve human lives and mitigate risks." # URL to the Author's Profile (i.e., Github, Twitter, Stackoverflow, etc) author-url: https://www.linkedin.com/in/johnlyleiv @@ -99,3 +99,4 @@ exclude: - vendor/ruby/ repository: jbliv/jbliv.github.io + diff --git a/_includes/about.html b/_includes/about.html index d061f1e331..629b8332ca 100644 --- a/_includes/about.html +++ b/_includes/about.html @@ -15,12 +15,10 @@ Contact Me -
Projects -
diff --git a/_includes/contact.html b/_includes/contact.html index 355dafc2c3..bee7caaf0a 100644 --- a/_includes/contact.html +++ b/_includes/contact.html @@ -26,7 +26,7 @@ {% endunless %}

- You Can find me here + You can find me here

{% unless site.facebook_username == %} diff --git a/_includes/showcase.html b/_includes/showcase.html index 58af9b3409..0172591de6 100644 --- a/_includes/showcase.html +++ b/_includes/showcase.html @@ -10,7 +10,6 @@

About Me -

\ No newline at end of file diff --git a/_posts/2023-01-10-optimal-schedule-algorithm.md b/_posts/2023-01-10-optimal-schedule-algorithm.md index 5a8407c466..421a1a737a 100644 --- a/_posts/2023-01-10-optimal-schedule-algorithm.md +++ b/_posts/2023-01-10-optimal-schedule-algorithm.md @@ -15,7 +15,7 @@ tags: **Problem Statement:** Automate the schedule creation process for a snack shop while accounting for current metrics used in manual schedule creation. -**Project Details:** This is a project I did of my own volition during winter break of my sophomore year. It was made using python and the openpyxl package. It is only partially completed and does not have the final touches I wish it did. Although the code is not the prettiest and is missing some features I am still proud of the underlying algorithm which this post will focus heavily on. I took user availability and working time preferences from an excel sheet which was read by the program and then an optimal schedule was created and formatted into another excel sheet for printing and distribution. +**Project Details:** This is a project I did of my own volition during winter break of my sophomore year. It was made using Python and the openpyxl package. It is only partially completed and does not have the final touches I wish it did. Although the code is not the prettiest and is missing some features, I am still proud of the underlying algorithm which this post will focus heavily on. I took user availability and working time preferences from an Excel sheet which was read by the program, and then an optimal schedule was created and formatted into another Excel sheet for printing and distribution. **Project Background:** The schedule that this program creates is for a student org snack shop. There are 16 officers that each have 3 hours a week. The shop is open 8 hours a day 5 days a week. The image below shows an example of how one would fill out their availability. 
The red indicates unavailable, the white indicates available, the light green indicates preferred, and the dark green indicates ideal. The cover image in this post is the final result of the program. @@ -36,13 +36,13 @@ tags: 6. Least amount of light green. (Focus on ideal hours) 7. Least middle hour of 3 hour blocks. (When a person has 3 hours of straight availability try and pick the first or last hour) -**Class Based Structure:** This program utilizes object oriented program and python classes in order to effectively accomplish the goal. I created classes for both the officer and the schedule. This allowed me to store availability for each officer and the schedule created as well as it's metrics within the corresponding class. I was also able to create functions for each class such as ones for getting an officer's availability and for getting metrics from a schedule. +**Class Based Structure:** This program utilizes object-oriented programming and Python classes in order to effectively accomplish the goal. I created classes for both the officer and the schedule. This allowed me to store availability for each officer and the schedule created as well as its metrics within the corresponding class. I was also able to create functions for each class such as ones for getting an officer's availability and for getting metrics from a schedule. -**Core Algorithm:** The main algorithm behind this program uses a tree based structure with a combination of breadth and depth first searches. The root of the tree is an empty schedule and a child is created for each officer available within the first hour of the first day. This process is repeated for each hour under each child and added to the queue. Children are added to the queue with those representing an addition of an Ideal or Preferred hour first. +**Core Algorithm:** The main algorithm behind this program uses a tree-based structure with a combination of breadth and depth first searches. 
The root of the tree is an empty schedule and a child is created for each officer available within the first hour of the first day. This process is repeated for each hour under each child and added to the queue. Children are added to the queue with those representing an addition of an Ideal or Preferred hour first. Once the queue reaches a size of 10,000, the search is switched to a depth-first search until it returns below 10,000. Due to the order of placement in the queue, schedules with more Ideal hours tend to appear first when it is treated as a stack. When a complete possible schedule is found, it is added to the list of solutions and its metrics are recorded. From then on, new schedules are only added if they are equal in metrics rankings. If a better schedule is found, the list is reset along with the comparison metrics. -If the program checked every possible schedule combination it would take days or weeks to run. In order to optimize the run time each branch has it's metrics tracked as new children are created. If the branch gets to a point where it will never have better rankings than the current best solution then the branch is removed and no further search will be done on it. This drastically decreases the amount of nodes searched and in turn decreases computing time. +If the program checked every possible schedule combination, it would take days or weeks to run. In order to optimize the run time, each branch has its metrics tracked as new children are created. If the branch gets to a point where it will never have better rankings than the current best solution, then the branch is removed and no further search will be done on it. This drastically decreases the number of nodes searched and in turn decreases computing time. During my testing, the input data I used tended to search 340 million nodes and locate 1.6 million solutions in ~40 minutes of runtime before determining that the optimal solution had been found. 
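The pruned tree search described above can be sketched in Python. This is an illustrative simplification only — a single scalar score stands in for the seven ranked metrics, the BFS-to-DFS switching at 10,000 queued nodes is omitted, and the availability data is made up — not the original program:

```python
from collections import deque

# Illustrative availability: {officer: {hour_slot: score}} where higher is
# better (0 = unavailable, 1 = available, 2 = preferred, 3 = ideal).
AVAILABILITY = {
    "A": {0: 3, 1: 1, 2: 0},
    "B": {0: 1, 1: 3, 2: 2},
    "C": {0: 2, 1: 0, 2: 3},
}
SLOTS = [0, 1, 2]

def best_schedules(availability, slots):
    """Breadth-first expansion with branch-and-bound pruning.

    Each node is a partial schedule (a tuple of officers, one per slot).
    A branch is discarded when even a perfect completion (every remaining
    slot scoring ideal) cannot beat the best total found so far.
    """
    max_score = max(max(v.values()) for v in availability.values())
    best_total, best = -1, []
    queue = deque([((), 0)])  # (partial schedule, score so far)
    while queue:
        partial, score = queue.popleft()
        depth = len(partial)
        if depth == len(slots):
            # Complete schedule: keep it if it ties or beats the best.
            if score > best_total:
                best_total, best = score, [partial]
            elif score == best_total:
                best.append(partial)
            continue
        # Prune: optimistic bound assumes every remaining slot is ideal.
        if score + max_score * (len(slots) - depth) < best_total:
            continue
        slot = slots[depth]
        for officer, avail in availability.items():
            if officer not in partial and avail[slot] > 0:
                queue.append((partial + (officer,), score + avail[slot]))
    return best_total, best
```

With the toy data above, the search returns the single optimal assignment in which every slot gets an ideal-scored officer.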
diff --git a/_posts/2023-04-24-wings.md b/_posts/2023-04-24-wings.md index ced4a6f98a..1d5d858294 100644 --- a/_posts/2023-04-24-wings.md +++ b/_posts/2023-04-24-wings.md @@ -13,7 +13,7 @@ tags: **Problem Statement:** The primary goal of this project was to create a wireless noble gas sampler fleet that was both low cost and easily portable. This project was handed off to myself and a few other undergraduates from a previous group of undergrads to fix a few things with the design and then finish manufacturing the rest of the gas samplers. -**Project Background:** Underground nuclear weapons testing commonly releases a radioactive isotope of xenon as a byproduct of such tests. Since noble gases travel through soil into the atmosphere these gases is a great way to detect nuclear weapons tests. Current solutions for noble gas sampling require incredibly expensive machines that are nearly impossible to setup in remote locations. By creating a cheaper and portable solution for collecting samples we can instead ship these samples back to labs to be tested instead. Another motivation behind this program was to create improved atmospheric transport models for the gas studied. +**Project Background:** Underground nuclear weapons testing commonly releases a radioactive isotope of xenon as a byproduct of such tests. Since noble gases travel through soil into the atmosphere, these gases are a great way to detect nuclear weapons tests. Current solutions for noble gas sampling require incredibly expensive machines that are nearly impossible to set up in remote locations. By creating a cheaper and portable solution for collecting samples, we can instead ship these samples back to labs to be tested. Another motivation behind this program was to create improved atmospheric transport models for the gas studied. **Project Timeline:** I joined this group in May 2022 where we were tasked with adding a few finishing touches to the design and manufacturing 4 more of the sampling units. 
In my first few weeks on this project I spent a lot of time familiarizing myself with all aspects of the design and assembly process. By doing this I helped myself better understand how the gas sampler worked and, more importantly, I caught multiple key design flaws in the sampler that required drastic changes to remedy. After creating fixes for all of these flaws I assembled and documented the assembly procedure for the first of 5 samplers to be built. @@ -37,7 +37,7 @@ Once assembly was complete the samplers were used in an irradiated gas release a --- # Individual Contributions -**Design Flaw Identification and Fixing:** While acquainting myself with this project I found many flaws that would have greatly hindered the operating abilities of the gas samplers. Once I identified all of these flaws I then designed solutions for each of them. +**Design Flaw Identification and Fixing:** While acquainting myself with this project, I found many flaws that would have greatly hindered the operating abilities of the gas samplers. Once I identified all of these flaws, I then designed solutions for each of them. **1. Incorrect solenoid selection:** When examining the data sheet for the solenoids I noticed they were only capable of holding the desired rated pressure of 135 psi in one direction. This would mean that samples would leak once the main line was depressurized. I selected a replacement component that was a normally closed globe valve that would automatically close in case of power loss to preserve the sample. This component would satisfy pressure requirements and was significantly cheaper; however, it took 5 seconds to actuate completely. @@ -55,7 +55,7 @@ Once assembly was complete the samplers were used in an irradiated gas release a New Tank Layout -**Software Contributions:** In order to account for the actuation delay of the solenoids a delay in a few areas of the code had to be created. 
Since the Arduino is single core MCU I had to learn how to do simulated multi-threading in order to allow other functions such as error checking to run during said delays. In addition to this I also fixed various bugs with the web interface as well. +**Software Contributions:** In order to account for the actuation delay of the solenoids, a delay in a few areas of the code had to be created. Since the Arduino is a single-core MCU, I had to learn how to do simulated multi-threading in order to allow other functions such as error checking to run during said delays. In addition to this, I also fixed various bugs with the web interface as well. **Safety Cap:** It was important to have a method of identifying tanks full of irradiated samples and ensure that they couldn't be accidentally opened. I designed the safety caps shown below to remove the possibility of accidental sample release. diff --git a/_posts/2023-10-3-handheld-controller.md b/_posts/2023-10-3-handheld-controller.md index 0118b916e2..9c0da2cce5 100644 --- a/_posts/2023-10-3-handheld-controller.md +++ b/_posts/2023-10-3-handheld-controller.md @@ -15,9 +15,9 @@ tags: Demonstration GIF -**Problem Statement:** Create an ergonomic controller that integrates 1 analog input and 4 digital inputs from the user. All inputs must be sent through 2 analog signals and all inputs must be useable with the hand in a closed grasp position. In addition I also had to create a ROS node for decoding the analog signals and triggering the associated functions. +**Problem Statement:** Create an ergonomic controller that integrates 1 analog input and 4 digital inputs from the user. All inputs must be sent through 2 analog signals and all inputs must be usable with the hand in a closed grasp position. In addition, I also had to create a ROS node for decoding the analog signals and triggering the associated functions. 
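The "simulated multi-threading" mentioned in the WINGS software contributions — checking elapsed time on each loop pass instead of blocking, so error checking keeps running during the several-second valve actuation — can be sketched in Python. The actual firmware was Arduino code using this same millis()-style pattern; the names and timings here are illustrative:

```python
import time

class NonBlockingDelay:
    """Non-blocking timer in the style of Arduino's millis() pattern:
    instead of sleeping through a delay, each loop pass checks elapsed
    time, so other tasks (error checking, a web interface) keep running
    on a single core."""

    def __init__(self, duration_s):
        self.duration_s = duration_s
        self.started_at = None

    def start(self):
        self.started_at = time.monotonic()

    def expired(self):
        if self.started_at is None:
            return False
        return time.monotonic() - self.started_at >= self.duration_s

# Main-loop sketch: ride out a solenoid actuation without blocking.
# (Duration shortened from the real ~5 s for demonstration.)
valve_delay = NonBlockingDelay(0.05)
valve_delay.start()
error_checks = 0
while not valve_delay.expired():
    error_checks += 1        # stand-in for error-checking work
    time.sleep(0.001)
```

The key design point is that no task ever calls a long blocking delay; every task polls its own timer and yields back to the main loop.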
-**Project Timeline:** This project was given to me during my robotics engineering internship at [Contoro Robotics](https://www.contoro.com/). First I designed the circuit that allowed me to send 1 analog input and 4 digital inputs through 2 analog signals. Next I iterated through many prototypes of main body designs. Using feedback from my coworkers I arrived at a base design that was comfortable to hold. Unfortunately the grip still wasn't comfortable enough for continuous use over an extended period of time. I then decided to pivot to a consumer pilot stick from Logitech. I disassembled this pilot stick and implemented my circuit by modifying the included PCB as well as replacing the output USB cable with the required connector for my use case. Once the controller was ready to be implemented I mounted it on the exoskeleton and performed a calibration. Finally I created a ROS node for decoding the signals and performing the desired functions. +**Project Timeline:** This project was given to me during my robotics engineering internship at [Contoro Robotics](https://www.contoro.com/). First, I designed the circuit that allowed me to send 1 analog input and 4 digital inputs through 2 analog signals. Next, I iterated through many prototypes of main body designs. Using feedback from my coworkers, I arrived at a base design that was comfortable to hold. Unfortunately, the grip still wasn't comfortable enough for continuous use over an extended period of time. I then decided to pivot to a consumer pilot stick from Logitech. I disassembled this pilot stick and implemented my circuit by modifying the included PCB as well as replacing the output USB cable with the required connector for my use case. Once the controller was ready to be implemented, I mounted it on the exoskeleton and performed a calibration. Finally, I created a ROS node for decoding the signals and performing the desired functions. 
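The controller post does not specify how the 4 digital inputs share an analog line; one common approach, assumed here purely for illustration, is a resistor ladder that maps each button to a distinct voltage band while the second channel carries the stick axis directly. A decoding sketch of that assumed scheme, of the kind a ROS node might run:

```python
# Hypothetical encoding (not taken from the original design): channel 1
# carries the analog stick axis; the four buttons form a resistor ladder
# on channel 2, each landing in a distinct voltage band. Button names
# and band edges are illustrative.
BUTTON_BANDS = {
    (0.0, 0.5): None,       # no button pressed
    (0.5, 1.5): "trigger",
    (1.5, 2.5): "mode",
    (2.5, 3.5): "open",
    (3.5, 5.0): "close",
}

def decode(ch1_volts, ch2_volts, vref=5.0):
    """Map two analog voltages to (axis position in [-1, 1], active button)."""
    axis = 2.0 * ch1_volts / vref - 1.0
    button = None
    for (lo, hi), name in BUTTON_BANDS.items():
        if lo <= ch2_volts < hi:
            button = name
            break
    return axis, button
```

In a ROS node, the decoded tuple would then be published or used to trigger the associated callback for each button.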
--- # Engineering Process Overview diff --git a/_posts/2024-03-11-horns-of-fury.md b/_posts/2024-03-11-horns-of-fury.md index 6fd3fd1cae..d9fe0b0387 100644 --- a/_posts/2024-03-11-horns-of-fury.md +++ b/_posts/2024-03-11-horns-of-fury.md @@ -45,19 +45,19 @@ tags: --- # Personal Contributions -**Electronics:** I was in charge of the electronics subassembly for this battlebot. In order to begin my work I first worked with the team to determine desired top speed and acceleration of the bot and the weapon. Using these numbers I figured out torque and rpm requirements for motors. Once motors were selected I was able to determine how large of a battery to select. When selecting disconnects and wires one of my main focuses was to make sure we could easily swap out all of the components quickly without soldering anything. This was crucial for competition day in case we ran into any issues with the electronics. +**Electronics:** I was in charge of the electronics subassembly for this battlebot. In order to begin my work, I first worked with the team to determine desired top speed and acceleration of the bot and the weapon. Using these numbers, I figured out torque and rpm requirements for motors. Once motors were selected, I was able to determine how large of a battery to select. When selecting disconnects and wires, one of my main focuses was to make sure we could easily swap out all of the components quickly without soldering anything. This was crucial for competition day in case we ran into any issues with the electronics. *Electronics Diagram* Electronics -**Controls:** Another thing I was tasked with for this project was selecting a transmitter and receiver for the bot and working with the driver to to customize the controls to his likings. One of the key things that I implemented was a switch for if we got flipped upside down. This switch modified all of the controls to allow our driver to continue driving as if the bot were still upright. 
I also implemented safety shutoffs incase the receiver ever lost signal all motors would lose power as well. +**Controls:** Another thing I was tasked with for this project was selecting a transmitter and receiver for the bot and working with the driver to customize the controls to his liking. One of the key things that I implemented was a switch for if we got flipped upside down. This switch modified all of the controls to allow our driver to continue driving as if the bot were still upright. I also implemented safety shutoffs in case the receiver ever lost signal; all motors would lose power as well. *Upside Down Controls* Upside Down Driving -**Repairs and Diagnosing Competition Day Problems:** In the days leading up to the competition one of my main focuses was thinking through any issues we might run into and setting up our repair process between matches. Since we would only have 30 minutes between matches it was important to be able to quickly diagnose any electronics issues we had while simultaneously replacing any of the other components. In order to most efficiently do this we used our spare parts to assemble a second bot so that we could quickly swap over the receiver and have another operational bot. +**Repairs and Diagnosing Competition Day Problems:** In the days leading up to the competition, one of my main focuses was thinking through any issues we might run into and setting up our repair process between matches. Since we would only have 30 minutes between matches, it was important to be able to quickly diagnose any electronics issues we had while simultaneously replacing any of the other components. In order to most efficiently do this, we used our spare parts to assemble a second bot so that we could quickly swap over the receiver and have another operational bot. *Post Competition Picture* @@ -66,7 +66,7 @@ tags: --- # Key Takeaways and Skills Utilized -This project was the some of the most fun I have had as an engineer. 
Working with some of my best friends to design and build an incredibly successful bot in only 5 weeks was incredibly difficult. It gave me the chance to apply so many things I had learned about in classes and also learn so much more. +This project was some of the most fun I have had as an engineer. Working with some of my best friends to design and build an incredibly successful bot in only 5 weeks was incredibly challenging. It gave me the chance to apply so many things I had learned about in classes and also learn so much more. **Skills Used** * Dynamics Analysis diff --git a/_posts/2024-04-29-rmd-final-project.md b/_posts/2024-04-29-rmd-final-project.md index 304d4a7c47..b3a8a1c59c 100644 --- a/_posts/2024-04-29-rmd-final-project.md +++ b/_posts/2024-04-29-rmd-final-project.md @@ -58,7 +58,7 @@ tags: Projectile Motion -**Electronics and Controls:** The controller for this project was an Arduino Uno which controlled a 12V DC motor through an L298N motor driver. I setup the whole electrical system and wrote the code that operated the mechanism. The buttons were used to cycle through various PWM signals and run times for each shot distance. These values were determined using the aforementioned projectile motion analysis and fine tuned using experimental data. +**Electronics and Controls:** The controller for this project was an Arduino Uno which controlled a 12V DC motor through an L298N motor driver. I set up the whole electrical system and wrote the code that operated the mechanism. The buttons were used to cycle through various PWM signals and run times for each shot distance. These values were determined using the aforementioned projectile motion analysis and fine-tuned using experimental data. *Electronics Diagram* @@ -77,7 +77,7 @@ tags: --- # Key Takeaways and Skills Utilized -This project was a great opportunity to apply my learnings in mechanism design and kinematic analysis from the course. 
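The projectile motion analysis used to map shot distances to PWM values and run times reduces, in its simplest form, to the standard range equation. A minimal sketch under a flat-ground, drag-free assumption (not the actual course calculations):

```python
import math

def launch_speed(range_m, angle_deg, g=9.81):
    """Launch speed needed to reach a horizontal range on flat ground,
    ignoring drag: R = v^2 * sin(2*theta) / g  =>  v = sqrt(R*g / sin(2*theta)).
    This back-of-the-envelope speed is what a PWM/run-time lookup table
    would then be tuned to produce experimentally."""
    theta = math.radians(angle_deg)
    return math.sqrt(range_m * g / math.sin(2.0 * theta))
```

At a 45-degree launch angle the expression simplifies to sqrt(R * g), which makes a quick sanity check easy.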
It taught how to use preliminary calculations to form a prototype and then iterate to achieve desired motion profiles. +This project was a great opportunity to apply my learnings in mechanism design and kinematic analysis from the course. It taught me how to use preliminary calculations to form a prototype and then iterate to achieve desired motion profiles. **Skills Used** * Python programming diff --git a/_posts/2024-12-05-gaze-based-multi-agent-control.md index 1c3ca537a6..f6e14ea28e 100644 --- a/_posts/2024-12-05-gaze-based-multi-agent-control.md +++ b/_posts/2024-12-05-gaze-based-multi-agent-control.md @@ -35,11 +35,11 @@ My primary contribution to this project was the method of determining which agen **Position Based:** The simplest method of selection is based on which agent the gaze position is closest to. When other methods come up inconclusive, the program defaults to this method. This method is incredibly reliable, yet when the turtles are far apart, it takes a while to recognize that the user has switched their focus. -**Velocity Based:** This is the primary method of selection for this program. I implemented a Kalman filter so that the velocity of the users gaze can be approximated from the time-series position data. This logic then uses the difference between the velocity vector and the vector from the gaze location to the turtle location to determine where the user is on the way to looking at. Without any constraints this method is not very robust. I constrained the logic even further to require a minimum threshold velocity to be surpassed and for the angle between the vectors to be within a certain range. This ensures that the correct agent is selected whenever velocity based selection is used. If these criteria are not met then the position based selection is used as a fallback. +**Velocity Based:** This is the primary method of selection for this program. 
I implemented a Kalman filter so that the velocity of the user's gaze can be approximated from the time-series position data. This logic then uses the difference between the velocity vector and the vector from the gaze location to the turtle location to determine where the user is on the way to looking. Without any constraints, this method is not very robust. I constrained the logic even further to require a minimum threshold velocity to be surpassed and for the angle between the vectors to be within a certain range. This ensures that the correct agent is selected whenever velocity-based selection is used. If these criteria are not met, then the position-based selection is used as a fallback. Velocity Based -**Classifer Model: (IN PROGRESS)** The implementation of a classifier will be explored if this project is continued and used in expanded applications. This would likely use some sort of random forest classifier. This would allow us to take ground truth data of what agent the user is attempting to control from collected samples and figure out what signals indicate a user has switched their focus. +**Classifier Model: (IN PROGRESS)** The implementation of a classifier will be explored if this project is continued and used in expanded applications. This would likely use some sort of random forest classifier. This would allow us to take ground truth data of what agent the user is attempting to control from collected samples and figure out what signals indicate a user has switched their focus. --- diff --git a/_posts/2024-12-06-senior-design-3lb-battlebot.md b/_posts/2024-12-06-senior-design-3lb-battlebot.md index 5304a070d8..6233ee1ce0 100644 --- a/_posts/2024-12-06-senior-design-3lb-battlebot.md +++ b/_posts/2024-12-06-senior-design-3lb-battlebot.md @@ -23,13 +23,13 @@ tags: Inner Structure -**Electronics:** One of the main struggles I have seen in the past with budget 3 lb battlebots is unreliable electronics that tend to fail. 
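The velocity-based selection with position-based fallback described in the gaze post can be sketched as follows. The Kalman filter itself is omitted here — the velocity estimate is taken as given — and all thresholds and names are illustrative, not the project's actual values:

```python
import math

def select_agent(gaze_pos, gaze_vel, agents,
                 min_speed=50.0, max_angle_deg=20.0):
    """Pick the agent the user is looking toward.

    gaze_vel would come from a Kalman filter over gaze positions; here it
    is supplied directly. An agent is selected by velocity only if the
    gaze is moving fast enough and roughly toward it (angle between the
    velocity vector and the gaze-to-agent vector within a range);
    otherwise fall back to the nearest agent.
    """
    speed = math.hypot(*gaze_vel)
    if speed >= min_speed:
        best, best_angle = None, math.radians(max_angle_deg)
        for name, (ax, ay) in agents.items():
            dx, dy = ax - gaze_pos[0], ay - gaze_pos[1]
            dist = math.hypot(dx, dy)
            if dist == 0:
                return name  # already looking right at this agent
            cos_a = (gaze_vel[0] * dx + gaze_vel[1] * dy) / (speed * dist)
            angle = math.acos(max(-1.0, min(1.0, cos_a)))
            if angle <= best_angle:
                best, best_angle = name, angle
        if best is not None:
            return best
    # Position-based fallback: nearest agent to the gaze point.
    return min(agents, key=lambda n: math.hypot(agents[n][0] - gaze_pos[0],
                                                agents[n][1] - gaze_pos[1]))
```

The two constraints mirror the post: a minimum speed threshold gates the velocity logic, and the angle window rejects motion that is not clearly toward any one agent.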
Because of this I pushed for our team to dedicate a larger portion of our budget to electronics. This ended up being arond 50% of our overall budget and allowed us to use high quality speed controllers for all of our motors. I designed our electronics system to have easily swappable components and no single piece was directly wired to another without an XT60 or XT30 connector. The system ran using a 3s 850 mAh battery that would last us long enough to run weapon and drive motors for a full match. My work on the electronics subassembly also included ensuring the bot met all safety requirements. These included a visible power LED, failsafe, and a manual power disconnect. +**Electronics:** One of the main struggles I have seen in the past with budget 3 lb battlebots is unreliable electronics that tend to fail. Because of this, I pushed for our team to dedicate a larger portion of our budget to electronics. This ended up being around 50% of our overall budget and allowed us to use high-quality speed controllers for all of our motors. I designed our electronics system to have easily swappable components and no single piece was directly wired to another without an XT60 or XT30 connector. The system ran using a 3s 850 mAh battery that would last us long enough to run weapon and drive motors for a full match. My work on the electronics subassembly also included ensuring the bot met all safety requirements. These included a visible power LED, failsafe, and a manual power disconnect. *Rear Image Highlighting LED* Rear of Battlebot -**Weapon Subassembly:** Our weapon was a vertical spinner with a dead axle that used a timing belt between two pulleys to transmit motion from the motor to the weapon itself. I assisted heavily in the design choices behind our weapon, primarily influencing our decision to use 2 bearings to assist in counteracting side impacts to the weapon and any resulting moments from these impacts. 
The weapon was mounted to a timing pulley that had 4 through holes for bolts. Both the weapon and the pulley had a bearing press fit into them which allowed the weapon to rotate freely around the axle while having good resistance to radial and moment loads. The weapon was made of AR-500 steel that was laser cut by SendCutSend. In competition our weapon was very succesful and bent the steel weapon of one of our opponents and fractured the aluminum frame of another +**Weapon Subassembly:** Our weapon was a vertical spinner with a dead axle that used a timing belt between two pulleys to transmit motion from the motor to the weapon itself. I assisted heavily in the design choices behind our weapon, primarily influencing our decision to use 2 bearings to assist in counteracting side impacts to the weapon and any resulting moments from these impacts. The weapon was mounted to a timing pulley that had 4 through holes for bolts. Both the weapon and the pulley had a bearing press-fit into them which allowed the weapon to rotate freely around the axle while having good resistance to radial and moment loads. The weapon was made of AR-500 steel that was laser cut by SendCutSend. In competition, our weapon was very successful and bent the steel weapon of one of our opponents and fractured the aluminum frame of another. *AR-500 Weapon* diff --git a/_posts/2025-04-28-senior-design-3d-printing-arm.md b/_posts/2025-04-28-senior-design-3d-printing-arm.md index 8ee9297291..d1d1b06c39 100644 --- a/_posts/2025-04-28-senior-design-3d-printing-arm.md +++ b/_posts/2025-04-28-senior-design-3d-printing-arm.md @@ -8,7 +8,7 @@ tags: - C++ - Python - Motion Planning -- Softare Architecture +- Software Architecture --- # Summary @@ -28,7 +28,7 @@ This page will primarily cover my contributions to the project including the ove --- # Software Architecture -This project was based in ROS2 using a mix of both Python and C++. 
I decided to use ROS2 for the project to allow the use of commonly used robotics packages such as MoveIt2 and RViz2. The primary node for the system acted as an action server for the user interface. Slicing and printing requests from the user would come in to the action server and it would then communicate with the other nodes to complete the printing process. The control of the end effector was managed within a separate node that acted as a service for the main control node to use as a client. The slicing path generator acted as an action server to feed printing paths to main control node throughout the printing process. The final node was a MoveIt move group that managed the cartesian motion planning and execution. +This project was based in ROS2 using a mix of both Python and C++. I decided to use ROS2 for the project to allow the use of commonly used robotics packages such as MoveIt2 and RViz2. The primary node for the system acted as an action server for the user interface. Slicing and printing requests from the user would come in to the action server, and it would then communicate with the other nodes to complete the printing process. The control of the end effector was managed within a separate node that acted as a service for the main control node to use as a client. The slicing path generator acted as an action server to feed printing paths to the main control node throughout the printing process. The final node was a MoveIt move group that managed the Cartesian motion planning and execution. *Software Flowchart* @@ -37,7 +37,7 @@ This project was based in ROS2 using a mix of both Python and C++. I decided to --- # Motion Planning Environment -The motion planning for this project utilizes MoveIt2 to perform cartesian motion paths along the desired printing trajectory. To account for the custom end effector used I created a URDF of our modified UR5e arm setup. This was in turn used to create a custom UR control package and a custom MoveIt configuration. 
+The motion planning for this project utilizes MoveIt2 to perform Cartesian motion paths along the desired printing trajectory. To account for the custom end effector used, I created a URDF of our modified UR5e arm setup. This was in turn used to create a custom UR control package and a custom MoveIt configuration. *Custom URDF* @@ -76,4 +76,4 @@ While this project was not finished during my time working on it which was limit [Final Presentation Slides](/assets/KFinalReport.pdf) -For information regarding this project, usage of the code, or questions please reach to myself by email. \ No newline at end of file +For information regarding this project, usage of the code, or questions, please reach out to me by email. \ No newline at end of file