diff --git a/404.md b/404.md
index 392573d811..52fea8faf6 100644
--- a/404.md
+++ b/404.md
@@ -3,4 +3,4 @@ title: 404
 layout: 404
 ---
-The page you're trying to load was not found
\ No newline at end of file
+The page you're trying to load was not found.
\ No newline at end of file
diff --git a/README.md b/README.md
index f38f20d8c6..3183a89fb5 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,8 @@
 # John Lyle's Portfolio
-## To Do List
-* AugRE Work
-* Drop Drone RL
+A Jekyll-based portfolio website showcasing my engineering projects, experiences, and technical capabilities.
+
+## Current Projects
+* Gaze-based multi-agent control research
+* 3D printing with robotic arms
+* Advanced battlebot development
diff --git a/_config.yml b/_config.yml
index 2a595dcd14..c54ed3112c 100644
--- a/_config.yml
+++ b/_config.yml
@@ -47,7 +47,7 @@ author-name: John Lyle
 # URL for the Image of the Author
 author-image: /assets/images/headshot.png
 # 60 Words About the Author
-author-about: I am a fourth-year mechanical engineering student at UT Austin. I have experience in electronics, software, and mechanical design, and love the interdisciplinary nature of robotics. I am passionate about developing innovative R&D solutions to improve human lives and mitigate risks.
+author-about: "I am a mechanical engineering master's student from UT Austin with experience in electronics, software, and mechanical design. I love the interdisciplinary nature of robotics and am passionate about developing innovative R&D solutions to improve human lives and mitigate risks."
 # URL to the Author's Profile (i.e., Github, Twitter, Stackoverflow, etc)
 author-url: https://www.linkedin.com/in/johnlyleiv
@@ -99,3 +99,4 @@
 exclude:
 - vendor/ruby/
 repository: jbliv/jbliv.github.io
+
diff --git a/_includes/about.html b/_includes/about.html
index d061f1e331..629b8332ca 100644
--- a/_includes/about.html
+++ b/_includes/about.html
@@ -15,12 +15,10 @@
 Contact Me
-
-**Software Contributions:** In order to account for the actuation delay of the solenoids a delay in a few areas of the code had to be created. Since the Arduino is single core MCU I had to learn how to do simulated multi-threading in order to allow other functions such as error checking to run during said delays. In addition to this I also fixed various bugs with the web interface as well.
+**Software Contributions:** To account for the actuation delay of the solenoids, I had to introduce delays in several areas of the code. Since the Arduino is a single-core MCU, I learned how to do simulated multi-threading so that other functions, such as error checking, could continue running during these delays. In addition, I fixed various bugs in the web interface.
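The "simulated multi-threading" pattern above can be sketched as follows. This is a minimal illustration of the idea (track when each task is next allowed to run instead of blocking in a sleep), not the actual firmware; the 50 ms actuation delay and 10 ms loop tick are hypothetical numbers.

```python
def run_loop(duration_ms=200, tick_ms=10):
    """Cooperative loop: the solenoid 'waits' without blocking other work."""
    solenoid_ready_at = 50  # hypothetical actuation delay in ms
    now = 0
    fired = False
    error_checks = 0
    while now < duration_ms:
        # Instead of delay(50), check whether the actuation time has elapsed.
        if not fired and now >= solenoid_ready_at:
            fired = True        # solenoid has finished actuating
        error_checks += 1       # other work (e.g. error checking) keeps running
        now += tick_ms          # simulated loop tick, like polling millis()
    return fired, error_checks

fired, checks = run_loop()
```

On an Arduino the same structure uses `millis()` comparisons inside `loop()` instead of a simulated clock.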
**Safety Cap:** It was important to have a method of identifying tanks full of irradiated samples and ensure that they couldn't be accidentally opened. I designed the safety caps shown below to remove the possibility of accidental sample release.
diff --git a/_posts/2023-10-3-handheld-controller.md b/_posts/2023-10-3-handheld-controller.md
index 0118b916e2..9c0da2cce5 100644
--- a/_posts/2023-10-3-handheld-controller.md
+++ b/_posts/2023-10-3-handheld-controller.md
@@ -15,9 +15,9 @@ tags:
-**Problem Statement:** Create an ergonomic controller that integrates 1 analog input and 4 digital inputs from the user. All inputs must be sent through 2 analog signals and all inputs must be useable with the hand in a closed grasp position. In addition I also had to create a ROS node for decoding the analog signals and triggering the associated functions.
+**Problem Statement:** Create an ergonomic controller that integrates 1 analog input and 4 digital inputs from the user. All inputs must be sent through 2 analog signals, and all inputs must be usable with the hand in a closed grasp position. In addition, I had to create a ROS node for decoding the analog signals and triggering the associated functions.
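One plausible way to pack 4 digital inputs onto a single analog line is a resistor ladder, where each button produces a distinct voltage level; the sketch below shows decoding logic in that spirit. The ADC levels and names here are hypothetical, not taken from the real circuit.

```python
# Nominal 10-bit ADC counts for each button state (illustrative values).
BUTTON_LEVELS = {
    "none": 0,
    "btn1": 256,
    "btn2": 512,
    "btn3": 768,
    "btn4": 1023,
}

def decode_buttons(adc_value):
    """Map a raw ADC reading to the nearest nominal button level."""
    return min(BUTTON_LEVELS, key=lambda name: abs(BUTTON_LEVELS[name] - adc_value))

def decode_controller(adc_buttons, adc_analog, adc_max=1023):
    """Decode both analog lines: (button id, normalized analog axis)."""
    return decode_buttons(adc_buttons), adc_analog / adc_max
```

Nearest-level matching makes the decoder tolerant to ADC noise as long as readings stay closer to the intended level than to its neighbors.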
-**Project Timeline:** This project was given to me during my robotics engineering internship at [Contoro Robotics](https://www.contoro.com/). First I designed the circuit that allowed me to send 1 analog input and 4 digital inputs through 2 analog signals. Next I iterated through many prototypes of main body designs. Using feedback from my coworkers I arrived at a base design that was comfortable to hold. Unfortunately the grip still wasn't comfortable enough for continuous use over an extended period of time. I then decided to pivot to a consumer pilot stick from Logitech. I disassembled this pilot stick and implemented my circuit by modifying the included PCB as well as replacing the output USB cable with the required connector for my use case. Once the controller was ready to be implemented I mounted it on the exoskeleton and performed a calibration. Finally I created a ROS node for decoding the signals and performing the desired functions.
+**Project Timeline:** This project was given to me during my robotics engineering internship at [Contoro Robotics](https://www.contoro.com/). First, I designed the circuit that allowed me to send 1 analog input and 4 digital inputs through 2 analog signals. Next, I iterated through many prototypes of main body designs. Using feedback from my coworkers, I arrived at a base design that was comfortable to hold. Unfortunately, the grip still wasn't comfortable enough for continuous use over an extended period of time. I then decided to pivot to a consumer pilot stick from Logitech. I disassembled this pilot stick and implemented my circuit by modifying the included PCB as well as replacing the output USB cable with the required connector for my use case. Once the controller was ready to be implemented, I mounted it on the exoskeleton and performed a calibration. Finally, I created a ROS node for decoding the signals and performing the desired functions.
---
# Engineering Process Overview
diff --git a/_posts/2024-03-11-horns-of-fury.md b/_posts/2024-03-11-horns-of-fury.md
index 6fd3fd1cae..d9fe0b0387 100644
--- a/_posts/2024-03-11-horns-of-fury.md
+++ b/_posts/2024-03-11-horns-of-fury.md
@@ -45,19 +45,19 @@ tags:
---
# Personal Contributions
-**Electronics:** I was in charge of the electronics subassembly for this battlebot. In order to begin my work I first worked with the team to determine desired top speed and acceleration of the bot and the weapon. Using these numbers I figured out torque and rpm requirements for motors. Once motors were selected I was able to determine how large of a battery to select. When selecting disconnects and wires one of my main focuses was to make sure we could easily swap out all of the components quickly without soldering anything. This was crucial for competition day in case we ran into any issues with the electronics.
+**Electronics:** I was in charge of the electronics subassembly for this battlebot. To begin my work, I first worked with the team to determine the desired top speed and acceleration of the bot and the weapon. Using these numbers, I figured out torque and RPM requirements for the motors. Once motors were selected, I was able to determine how large a battery to select. When selecting disconnects and wires, one of my main focuses was making sure we could swap out all of the components quickly without soldering anything. This was crucial for competition day in case we ran into any issues with the electronics.
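The sizing chain described above (speed/acceleration targets → motor torque and RPM → battery capacity) can be sketched as a back-of-the-envelope calculation. All numbers below are made-up stand-ins for the team's actual targets.

```python
import math

MASS = 13.6            # kg, hypothetical bot mass
WHEEL_RADIUS = 0.05    # m, hypothetical wheel radius
TOP_SPEED = 4.0        # m/s, desired top speed
ACCEL = 6.0            # m/s^2, desired acceleration
N_DRIVE_MOTORS = 2

# Torque per drive motor to hit the target acceleration (friction ignored):
torque_per_motor = MASS * ACCEL * WHEEL_RADIUS / N_DRIVE_MOTORS  # N*m

# Motor speed needed at top speed:
rpm = TOP_SPEED / (2 * math.pi * WHEEL_RADIUS) * 60

# Battery capacity from average current draw over a 3-minute match:
avg_current = 30.0     # A, hypothetical combined drive + weapon draw
match_minutes = 3.0
capacity_mah = avg_current * (match_minutes / 60) * 1000  # mAh
```

In practice a safety margin would be added on top of the computed capacity and torque figures.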
*Electronics Diagram*
-**Controls:** Another thing I was tasked with for this project was selecting a transmitter and receiver for the bot and working with the driver to to customize the controls to his likings. One of the key things that I implemented was a switch for if we got flipped upside down. This switch modified all of the controls to allow our driver to continue driving as if the bot were still upright. I also implemented safety shutoffs incase the receiver ever lost signal all motors would lose power as well.
+**Controls:** Another thing I was tasked with for this project was selecting a transmitter and receiver for the bot and working with the driver to customize the controls to his liking. One of the key things I implemented was a switch for when we got flipped upside down. This switch modified all of the controls to allow our driver to continue driving as if the bot were still upright. I also implemented a safety shutoff so that if the receiver ever lost signal, all motors would lose power.
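The inverted-driving remap can be illustrated with a tank-drive mix: when the bot is flipped, each drive motor spins the bot the opposite way and sits on the opposite side, so the channels are swapped and negated. This is my own illustrative version of the idea, not the team's actual transmitter mix.

```python
def mix_drive(left_stick, right_stick, flipped=False):
    """Tank-drive mix with an upside-down remap: swap sides and negate."""
    if flipped:
        return -right_stick, -left_stick
    return left_stick, right_stick

def failsafe(outputs, signal_ok):
    """Cut all motor power when the receiver loses signal."""
    return outputs if signal_ok else tuple(0 for _ in outputs)
```

On a hobby transmitter the same remap is typically configured as a mix bound to a toggle switch rather than written in code.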
*Upside Down Controls*
-**Repairs and Diagnosing Competition Day Problems:** In the days leading up to the competition one of my main focuses was thinking through any issues we might run into and setting up our repair process between matches. Since we would only have 30 minutes between matches it was important to be able to quickly diagnose any electronics issues we had while simultaneously replacing any of the other components. In order to most efficiently do this we used our spare parts to assemble a second bot so that we could quickly swap over the receiver and have another operational bot.
+**Repairs and Diagnosing Competition Day Problems:** In the days leading up to the competition, one of my main focuses was thinking through any issues we might run into and setting up our repair process between matches. Since we would only have 30 minutes between matches, it was important to be able to quickly diagnose any electronics issues we had while simultaneously replacing any of the other components. In order to most efficiently do this, we used our spare parts to assemble a second bot so that we could quickly swap over the receiver and have another operational bot.
*Post Competition Picture*
@@ -66,7 +66,7 @@ tags:
---
# Key Takeaways and Skills Utilized
-This project was the some of the most fun I have had as an engineer. Working with some of my best friends to design and build an incredibly successful bot in only 5 weeks was incredibly difficult. It gave me the chance to apply so many things I had learned about in classes and also learn so much more.
+This project was some of the most fun I have had as an engineer. Working with some of my best friends to design and build an incredibly successful bot in only 5 weeks was incredibly challenging. It gave me the chance to apply so many things I had learned about in classes and also learn so much more.
**Skills Used**
* Dynamics Analysis
diff --git a/_posts/2024-04-29-rmd-final-project.md b/_posts/2024-04-29-rmd-final-project.md
index 304d4a7c47..b3a8a1c59c 100644
--- a/_posts/2024-04-29-rmd-final-project.md
+++ b/_posts/2024-04-29-rmd-final-project.md
@@ -58,7 +58,7 @@ tags:
-**Electronics and Controls:** The controller for this project was an Arduino Uno which controlled a 12V DC motor through an L298N motor driver. I setup the whole electrical system and wrote the code that operated the mechanism. The buttons were used to cycle through various PWM signals and run times for each shot distance. These values were determined using the aforementioned projectile motion analysis and fine tuned using experimental data.
+**Electronics and Controls:** The controller for this project was an Arduino Uno, which controlled a 12V DC motor through an L298N motor driver. I set up the whole electrical system and wrote the code that operated the mechanism. The buttons were used to cycle through various PWM signals and run times for each shot distance. These values were determined using the aforementioned projectile motion analysis and fine-tuned using experimental data.
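The preset cycling can be sketched as follows: each button press steps through (PWM duty, run time) pairs, one per shot distance. This is an illustrative reconstruction, not the actual firmware, and the preset values are invented.

```python
# Hypothetical presets: (PWM duty 0-255, motor run time in ms) per distance.
SHOT_PRESETS = [
    (140, 250),  # short shot
    (190, 300),  # medium shot
    (255, 350),  # long shot
]

class ShotSelector:
    def __init__(self):
        self.index = 0

    def on_button_press(self):
        """Advance to the next preset, wrapping around to the first."""
        self.index = (self.index + 1) % len(SHOT_PRESETS)

    def current(self):
        """Return the (pwm, run_time_ms) pair currently selected."""
        return SHOT_PRESETS[self.index]
```

On the Arduino, the selected pair would feed `analogWrite()` for the duty cycle and a timed run of the L298N enable pin.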
*Electronics Diagram*
@@ -77,7 +77,7 @@ tags:
---
# Key Takeaways and Skills Utilized
-This project was a great opportunity to apply my learnings in mechanism design and kinematic analysis from the course. It taught how to use preliminary calculations to form a prototype and then iterate to achieve desired motion profiles.
+This project was a great opportunity to apply my learnings in mechanism design and kinematic analysis from the course. It taught me how to use preliminary calculations to form a prototype and then iterate to achieve desired motion profiles.
**Skills Used**
* Python programming
diff --git a/_posts/2024-12-05-gaze-based-multi-agent-control.md b/_posts/2024-12-05-gaze-based-multi-agent-control.md
index 1c3ca537a6..f6e14ea28e 100644
--- a/_posts/2024-12-05-gaze-based-multi-agent-control.md
+++ b/_posts/2024-12-05-gaze-based-multi-agent-control.md
@@ -35,11 +35,11 @@ My primary contribution to this project was the method of determining which agen
**Position Based:** The simplest method of selection is based on which agent the gaze position is closest to. When other methods are inconclusive, the program defaults to this method. This method is incredibly reliable, yet when the turtles are far apart, it takes a while to recognize that the user has switched their focus.
-**Velocity Based:** This is the primary method of selection for this program. I implemented a Kalman filter so that the velocity of the users gaze can be approximated from the time-series position data. This logic then uses the difference between the velocity vector and the vector from the gaze location to the turtle location to determine where the user is on the way to looking at. Without any constraints this method is not very robust. I constrained the logic even further to require a minimum threshold velocity to be surpassed and for the angle between the vectors to be within a certain range. This ensures that the correct agent is selected whenever velocity based selection is used. If these criteria are not met then the position based selection is used as a fallback.
+**Velocity Based:** This is the primary method of selection for this program. I implemented a Kalman filter so that the velocity of the user's gaze can be approximated from the time-series position data. The logic then uses the difference between the velocity vector and the vector from the gaze location to the turtle location to determine which agent the user is shifting their gaze toward. Without any constraints, this method is not very robust, so I further required a minimum threshold velocity to be surpassed and the angle between the vectors to be within a certain range. This ensures that the correct agent is selected whenever velocity-based selection is used. If these criteria are not met, then position-based selection is used as a fallback.
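The gating logic above can be sketched in simplified form. As a stand-in for the full Kalman filter, the snippet uses an alpha-beta filter (a fixed-gain cousin) to estimate gaze velocity; the gains, thresholds, sample period, and agent positions are all illustrative.

```python
import math

ALPHA, BETA = 0.85, 0.3       # illustrative filter gains
DT = 0.05                     # s, hypothetical gaze sample period
MIN_SPEED = 2.0               # gating: minimum gaze speed
MAX_ANGLE = math.radians(20)  # gating: max angle between velocity and agent vector

class GazeFilter:
    """Per-axis alpha-beta filter estimating gaze position and velocity."""
    def __init__(self, x0):
        self.pos = list(x0)
        self.vel = [0.0, 0.0]

    def update(self, meas):
        for i in range(2):
            pred = self.pos[i] + self.vel[i] * DT
            r = meas[i] - pred            # innovation
            self.pos[i] = pred + ALPHA * r
            self.vel[i] += BETA * r / DT

def select_agent(gaze, vel, agents):
    """Velocity-based selection with position-based fallback."""
    speed = math.hypot(*vel)
    if speed >= MIN_SPEED:
        for name, p in agents.items():
            to_agent = (p[0] - gaze[0], p[1] - gaze[1])
            dist = math.hypot(*to_agent)
            if dist < 1e-9:               # gaze already on this agent
                return name
            cosang = (vel[0] * to_agent[0] + vel[1] * to_agent[1]) / (speed * dist)
            if math.acos(max(-1.0, min(1.0, cosang))) <= MAX_ANGLE:
                return name
    # Fallback: nearest agent to the gaze position.
    return min(agents, key=lambda n: math.hypot(agents[n][0] - gaze[0],
                                                agents[n][1] - gaze[1]))
```

The speed gate keeps fixational jitter from triggering a switch, and the angle gate ensures the gaze is actually heading toward the candidate agent.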
-**Classifer Model: (IN PROGRESS)** The implementation of a classifier will be explored if this project is continued and used in expanded applications. This would likely use some sort of random forest classifier. This would allow us to take ground truth data of what agent the user is attempting to control from collected samples and figure out what signals indicate a user has switched their focus.
+**Classifier Model: (IN PROGRESS)** The implementation of a classifier will be explored if this project is continued and used in expanded applications. This would likely use some sort of random forest classifier. This would allow us to take ground truth data of what agent the user is attempting to control from collected samples and figure out what signals indicate a user has switched their focus.
---
diff --git a/_posts/2024-12-06-senior-design-3lb-battlebot.md b/_posts/2024-12-06-senior-design-3lb-battlebot.md
index 5304a070d8..6233ee1ce0 100644
--- a/_posts/2024-12-06-senior-design-3lb-battlebot.md
+++ b/_posts/2024-12-06-senior-design-3lb-battlebot.md
@@ -23,13 +23,13 @@ tags:
-**Electronics:** One of the main struggles I have seen in the past with budget 3 lb battlebots is unreliable electronics that tend to fail. Because of this I pushed for our team to dedicate a larger portion of our budget to electronics. This ended up being arond 50% of our overall budget and allowed us to use high quality speed controllers for all of our motors. I designed our electronics system to have easily swappable components and no single piece was directly wired to another without an XT60 or XT30 connector. The system ran using a 3s 850 mAh battery that would last us long enough to run weapon and drive motors for a full match. My work on the electronics subassembly also included ensuring the bot met all safety requirements. These included a visible power LED, failsafe, and a manual power disconnect.
+**Electronics:** One of the main struggles I have seen in the past with budget 3 lb battlebots is unreliable electronics that tend to fail. Because of this, I pushed for our team to dedicate a larger portion of our budget to electronics. This ended up being around 50% of our overall budget and allowed us to use high-quality speed controllers for all of our motors. I designed our electronics system to have easily swappable components, and no single piece was directly wired to another without an XT60 or XT30 connector. The system ran on a 3S 850 mAh battery that lasted long enough to run the weapon and drive motors for a full match. My work on the electronics subassembly also included ensuring the bot met all safety requirements, including a visible power LED, a failsafe, and a manual power disconnect.
*Rear Image Highlighting LED*
-**Weapon Subassembly:** Our weapon was a vertical spinner with a dead axle that used a timing belt between two pulleys to transmit motion from the motor to the weapon itself. I assisted heavily in the design choices behind our weapon, primarily influencing our decision to use 2 bearings to assist in counteracting side impacts to the weapon and any resulting moments from these impacts. The weapon was mounted to a timing pulley that had 4 through holes for bolts. Both the weapon and the pulley had a bearing press fit into them which allowed the weapon to rotate freely around the axle while having good resistance to radial and moment loads. The weapon was made of AR-500 steel that was laser cut by SendCutSend. In competition our weapon was very succesful and bent the steel weapon of one of our opponents and fractured the aluminum frame of another
+**Weapon Subassembly:** Our weapon was a vertical spinner with a dead axle that used a timing belt between two pulleys to transmit motion from the motor to the weapon itself. I assisted heavily in the design choices behind our weapon, primarily influencing our decision to use 2 bearings to assist in counteracting side impacts to the weapon and any resulting moments from these impacts. The weapon was mounted to a timing pulley that had 4 through holes for bolts. Both the weapon and the pulley had a bearing press-fit into them, which allowed the weapon to rotate freely around the axle while having good resistance to radial and moment loads. The weapon was made of AR-500 steel that was laser-cut by SendCutSend. In competition, our weapon was very successful: it bent the steel weapon of one of our opponents and fractured the aluminum frame of another.
*AR-500 Weapon*
diff --git a/_posts/2025-04-28-senior-design-3d-printing-arm.md b/_posts/2025-04-28-senior-design-3d-printing-arm.md
index 8ee9297291..d1d1b06c39 100644
--- a/_posts/2025-04-28-senior-design-3d-printing-arm.md
+++ b/_posts/2025-04-28-senior-design-3d-printing-arm.md
@@ -8,7 +8,7 @@ tags:
- C++
- Python
- Motion Planning
-- Softare Architecture
+- Software Architecture
---
# Summary
@@ -28,7 +28,7 @@ This page will primarily cover my contributions to the project including the ove
---
# Software Architecture
-This project was based in ROS2 using a mix of both Python and C++. I decided to use ROS2 for the project to allow the use of commonly used robotics packages such as MoveIt2 and RViz2. The primary node for the system acted as an action server for the user interface. Slicing and printing requests from the user would come in to the action server and it would then communicate with the other nodes to complete the printing process. The control of the end effector was managed within a separate node that acted as a service for the main control node to use as a client. The slicing path generator acted as an action server to feed printing paths to main control node throughout the printing process. The final node was a MoveIt move group that managed the cartesian motion planning and execution.
+This project was based in ROS2 using a mix of both Python and C++. I decided to use ROS2 to allow the use of commonly used robotics packages such as MoveIt2 and RViz2. The primary node for the system acted as an action server for the user interface. Slicing and printing requests from the user would come into the action server, which would then communicate with the other nodes to complete the printing process. Control of the end effector was managed within a separate node that acted as a service, with the main control node as its client. The slicing path generator acted as an action server to feed printing paths to the main control node throughout the printing process. The final node was a MoveIt move group that managed the Cartesian motion planning and execution.
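The call flow between these nodes can be modeled framework-free as plain functions. The real system used ROS2 action servers and services; the names and the toy slicer below are invented purely to show how the primary node orchestrates slicing, end-effector control, and motion execution.

```python
def slice_model(model):
    """Stand-in for the slicing action server: yields per-layer paths."""
    for layer in range(model["layers"]):
        yield {"layer": layer, "waypoints": [(0, 0, layer), (1, 0, layer)]}

def set_extruder(on):
    """Stand-in for the end-effector service called by the main node."""
    return {"extruding": on}

def execute_cartesian_path(path):
    """Stand-in for the MoveIt move group executing a Cartesian path."""
    return len(path["waypoints"])  # number of waypoints "executed"

def handle_print_request(model):
    """Stand-in for the primary node's action-server callback."""
    executed = 0
    set_extruder(True)
    for path in slice_model(model):        # paths streamed in during the print
        executed += execute_cartesian_path(path)
    set_extruder(False)
    return executed

total = handle_print_request({"layers": 3})
```

In the ROS2 version each of these functions corresponds to an action or service boundary, which lets slicing, extrusion control, and motion execution run in separate processes.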
*Software Flowchart*
@@ -37,7 +37,7 @@ This project was based in ROS2 using a mix of both Python and C++. I decided to
---
# Motion Planning Environment
-The motion planning for this project utilizes MoveIt2 to perform cartesian motion paths along the desired printing trajectory. To account for the custom end effector used I created a URDF of our modified UR5e arm setup. This was in turn used to create a custom UR control package and a custom MoveIt configuration.
+The motion planning for this project utilizes MoveIt2 to perform Cartesian motion paths along the desired printing trajectory. To account for the custom end effector used, I created a URDF of our modified UR5e arm setup. This was in turn used to create a custom UR control package and a custom MoveIt configuration.
*Custom URDF*
@@ -76,4 +76,4 @@ While this project was not finished during my time working on it which was limit
[Final Presentation Slides](/assets/KFinalReport.pdf)
-For information regarding this project, usage of the code, or questions please reach to myself by email.
\ No newline at end of file
+For information regarding this project, usage of the code, or questions, please reach out to me by email.
\ No newline at end of file