Dr. Natalie Hernandez has been a Product Engineer at DfR Solutions since November 2016. Before that, she completed her PhD in Physics at Lehigh University, where she served as a graduate research assistant conducting spectroscopic studies of rare-earth-doped wide-bandgap semiconductor materials, and she has since made the jump to electronics reliability engineering. After seven months in her new role, here are some of the key takeaways she’s learned about the industry.
With a background in Experimental Condensed Matter Physics, I was excited to join DfR Solutions and enter an entirely new field of study: electronics reliability engineering. Previously, I was surrounded by laboratory equipment and an abundance of lasers, as my curiosities were at the microscopic level. So imagine my surprise when I entered the electronics industry and realized that circuits were not the perfectly illustrated systems taught in my introductory physics courses, and that Kirchhoff’s laws only govern idealized systems! It didn’t take long for me to find out that resistors can be incredibly tiny (in fact, the creation of 03015 components was recently announced), and by now I’d only be slightly fazed by the prospect of a component being comparable in size to a solder ball.
Physics of Failure Testing
As the product engineer working with Sherlock Automated Design Analysis™ software, my first experience with reliability testing placed simulation at the heart of an effective design process. However, while simulation is becoming more prevalent in industry practice today, that hasn’t always been the case. Through my explorations of reliability engineering, I discovered the older industry standards and “one size fits all” models, which relied on extensive generic testing to produce broad guidelines for electronic components. I learned that the Physics of Failure (PoF) approach is a more recent methodology that applies physical models to make predictions based on actual usage environments, and it offers more opportunities to refine designs and isolate failures before testing.
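To make the PoF idea concrete: one classic physical model is the Coffin-Manson low-cycle fatigue relation, which maps the cyclic strain a solder joint experiences in its usage environment to an expected number of cycles to failure. The sketch below is a minimal, illustrative Python version; the default parameter values are commonly cited for SnPb solder and are assumptions here, not figures from any particular tool.

```python
def cycles_to_failure(delta_gamma, eps_f=0.325, c=-0.442):
    """Estimate mean cycles to failure via the Coffin-Manson relation.

    delta_gamma: cyclic shear strain range per thermal cycle (dimensionless)
    eps_f: fatigue ductility coefficient (0.325 is a commonly cited SnPb value)
    c: fatigue ductility exponent (assumed value; varies with cycle
       frequency and mean temperature)
    """
    # Coffin-Manson: delta_gamma / 2 = eps_f * (2 * N_f) ** c,
    # solved here for N_f.
    return 0.5 * (delta_gamma / (2 * eps_f)) ** (1 / c)
```

Because the exponent is negative, doubling the strain range cuts the predicted life by far more than half, which is why small CTE mismatches can matter so much over thousands of thermal cycles.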
Looking at today’s electronic devices, components can fail spectacularly in quite a few ways, each driven by a different failure mechanism. Failures due to solder fatigue are especially interesting since they are package specific and driven mostly by the coefficient of thermal expansion (CTE) mismatch between interacting materials. Still, a lead out of place or the simple presence of a ball grid array (BGA) on your board can be a source of headache and confusion. Thankfully, the simulation tools mentioned above can identify these problems before they happen.
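To see why CTE mismatch matters, a common first-order estimate relates the shear strain in a solder joint to the distance from the package’s neutral point, the CTE difference, the temperature swing, and the joint standoff height. Here is a hedged back-of-the-envelope version (a textbook approximation, not any specific tool’s model):

```python
def cte_mismatch_shear_strain(dnp_mm, d_alpha_ppm, d_temp_c, joint_height_mm):
    """First-order shear strain in a solder joint from CTE mismatch.

    dnp_mm: distance from the package's neutral point to the joint (mm)
    d_alpha_ppm: CTE mismatch between component and board (ppm/degC)
    d_temp_c: temperature swing of the thermal cycle (degC)
    joint_height_mm: solder joint standoff height (mm)
    """
    # The board and package expand by different amounts over the span dnp_mm;
    # that displacement mismatch is absorbed as shear across the joint height.
    return dnp_mm * (d_alpha_ppm * 1e-6) * d_temp_c / joint_height_mm
```

For example, a corner ball 10 mm from the neutral point, a 10 ppm/°C mismatch, a 100 °C swing, and a 0.1 mm standoff give a shear strain of about 0.1, which helps explain why the outermost balls of a large BGA tend to fail first.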
On that note, as a physicist, I feel obligated to mention my journey into the world of finite element analysis (FEA) modeling—what an eye opener! With FEA modeling, a user can approximate their printed circuit boards (PCBs) with a high degree of fidelity, perform a mesh convergence study, and determine which components will be the most problematic. Some naysayers suggest that a simulation can be fit to any data, but to them I reply: your predictions are only as good as your input data.
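The mesh convergence step mentioned above can be sketched in miniature: solve on successively finer meshes and stop once the quantity of interest stops changing. The “solver” below is deliberately a toy stand-in (a midpoint-rule integral over an n-element 1D mesh), purely to show the refinement loop; a real FEA convergence study follows the same pattern with actual element solutions.

```python
import math

def solve_on_mesh(n):
    # Toy stand-in for an FEA solve: compute a quantity of interest
    # (here, the integral of a smooth "load" over a unit-length part)
    # on an n-element mesh with one midpoint sample per element.
    h = 1.0 / n
    return sum(math.sin(math.pi * (i + 0.5) * h) * h for i in range(n))

def mesh_converge(tol=1e-3, n0=8):
    # Halve the element size each pass until the quantity of interest
    # changes by less than tol between successive refinements.
    prev, n = solve_on_mesh(n0), 2 * n0
    while True:
        cur = solve_on_mesh(n)
        if abs(cur - prev) / abs(cur) < tol:
            return n, cur
        prev, n = cur, 2 * n
```

The key design point is that convergence is judged by the change between refinements, not by comparison to a known answer, since in a real analysis no exact answer exists.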
The world of electronics reliability is incredibly dynamic and inventive, with innovations being made every day. From accounting for a multi-chip module on your board to qualifying a PCB headed for the next trip to Mars, there are plenty of exciting things happening right now. I’ve only touched on a couple of topics I’ve explored so far, but I know there’s still quite a bit to learn, and even more to create (anyone working on electrolytic capacitor failures?!).
If you’re curious to learn more about how you can help your devices avoid one of today’s most common failures, make sure to watch our free webinar on exploring vibration fatigue of electronic assemblies. Click the button below to access!