Elysium Robotics Unlocks Deep Process Insights by Leveraging Six Sigma Tools

Elysium Robotics is an early-stage company headquartered in Austin, TX, developing low-cost, high-performance muscle-like actuator technology for robotic applications. Using industry-proven problem-solving techniques, we accelerated the company's learning and process-improvement efforts.


8x increase in the rate of new process knowledge generation

75% reduction in cycle time in a sub-process

Challenge:

After working aggressively to refine its microfiber geometry, Elysium Robotics wanted to assess the impact of a wide range of variables on fiber performance. Because the fabrication process is a first of its kind and under active development, questions abounded. With a multitude of parameters changing between fabrication runs and individual samples, it was very difficult to compare performance results and draw actionable conclusions with a high degree of confidence. The team was also concerned about the complexities arising from its multi-step process, because effects from inputs at different steps can interact. Rather than waiting until the end of the entire process to evaluate these effects, the team needed to assess sub-step process outputs in a nimble and targeted way to keep optimizing the process.

Countermeasures:

After working closely with Elysium Robotics to analyze their historical data, we learned that:

  • Geometry data did not strongly correlate with performance as originally expected, indicating that other major sources of variation were masking its effects.

  • Samples from the same upstream process did not behave similarly, indicating that variation in downstream processes may significantly affect performance.
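A geometry-vs-performance check like the one above can be sketched as a simple correlation calculation. The sketch below uses made-up numbers purely for illustration; the actual metrics, units, and data Elysium tracks are not public. A weak correlation coefficient is what would prompt the conclusion that other sources of variation dominate.

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: fiber diameter (a geometry metric) vs. actuation
# force (a performance metric). These values are invented for the sketch.
diameter_um = [48, 52, 50, 47, 53, 49, 51, 50]
force_mN = [10.2, 9.1, 11.5, 8.8, 10.9, 11.1, 9.4, 10.0]

# A small |r| here would mean geometry alone does not explain
# performance, pointing to unmeasured sources of variation.
r = pearson_r(diameter_um, force_mN)
```

With data like this, |r| comes out well below typical "strong correlation" thresholds, which is the kind of quantitative signal that redirected the investigation toward downstream process variation.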

To further understand and address these variations, we recommended the following:

  • Understand root causes by identifying the failure mechanism of each sample through visual observation

  • Establish input-output relationships

    • Track sample processing data in addition to existing geometry metrics

    • Identify relevant non-destructive performance metrics (key process output variables) for sub-steps of the pipeline for experimentation and process monitoring 

  • Accelerate the pace of learning: rather than testing one factor at a time, we worked together to develop a framework for running experiments using the Design of Experiments (DOE) approach, in which multiple factors are investigated simultaneously. Assess the results, adjust, and iterate quickly.
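As a rough illustration of the DOE framework, the sketch below builds a two-level full factorial design and estimates main effects. The factor names, levels, and response values are hypothetical stand-ins, not Elysium's actual process parameters or data; the point is only to show how varying multiple factors at once still lets each factor's effect be separated.

```python
from itertools import product

# Hypothetical fabrication factors, each at a low and a high level.
factors = {
    "temperature_C": (60, 80),
    "draw_speed_mm_s": (5, 10),
    "cure_time_min": (30, 60),
}

# Full 2^3 factorial design: every combination of levels (8 runs),
# so main effects can be estimated independently of one another.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(runs, results, factor):
    """Average response at the factor's high level minus its low level."""
    low, high = factors[factor]
    hi = [r for run, r in zip(runs, results) if run[factor] == high]
    lo = [r for run, r in zip(runs, results) if run[factor] == low]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Made-up response measurements (e.g. fiber strength) for the 8 runs,
# in the same order the runs were generated.
results = [12.1, 13.4, 11.8, 12.9, 15.2, 16.8, 14.9, 16.1]

# Each factor's main effect, estimated from all 8 runs at once.
effects = {f: main_effect(runs, results, f) for f in factors}
```

Compared with one-factor-at-a-time testing, every run contributes to every factor's estimate, which is the source of the speed-up in learning; real DOE practice would add replicates and interaction terms on top of this.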

Value created:

This approach, along with the team’s efforts to track sample measurements and processing history individually, resulted in several immediate and long-term technical and operational benefits:

  • 8x increase in the rate of new process knowledge generation: The DOE approach gave the team the ability to ask specific questions about process inputs and outputs and to draw conclusions quantitatively, with confidence.

  • The first DOE highlighted opportunities for improvement, and the team moved swiftly to mistake-proof and standardize the parts of the process that affected the initial samples.

  • 75% reduction in cycle time: Deep dives into one downstream process showed that the cycle time for one step could be reduced from days to hours, resulting in significant time savings.

Lessons learned:

While the first DOE probed factors spanning the entire fabrication process, it highlighted areas that required further focused experiments for deeper understanding. The subsequent studies yielded a wealth of insights and improvement ideas that are expected to enhance overall performance. Continuing to practice Plan-Do-Check-Act (PDCA) and balancing breadth vs. depth will be key to further developing and optimizing the process efficiently.
