Technology accelerating the future of Internet of Things sensor design

Jon Lawson

Engineers have been using electronic design automation (EDA) software for many years to build integrated electronics with a very high rate of success. In fact, it has become rare for a new integrated circuit product to require more than a couple of iterations before entering mass production. This is largely a result of the robustness and reliability of the simulation and verification tools available on the market today.

By contrast, new products in the MEMS market routinely require many iterative design cycles. This is not because the physics involved is poorly understood, but rather because the available simulation tools are not efficient enough to allow full device and system verification at the rate required to keep up with the blistering pace of technology development, particularly in the consumer electronics market.

An excellent example of this challenge can be found in the fingerprint sensor industry, where mobile device OEMs are requiring more and more flexibility with regard to where they can integrate the sensor. Displays are now expanding to encompass the entire front of the device, edge-to-edge, leaving existing capacitive sensing technologies without a place to reside except for the back of the device.

Enter the ultrasonic fingerprint sensor – a device that is not only capable of imaging the ridges of a finger through the OLED display in a phone, but also of determining if the finger is actually a finger, and whether it is connected to a living person. These capabilities have set the stage for ultrasonic transducers, and specifically a MEMS variety called piezoelectric micromachined ultrasonic transducers (PMUT), to corner the consumer electronics market.

Despite their advantages, devices that use PMUTs are not simple to build. End-product performance is affected by every mechanical component in the solution, as well as the interaction between the signals from each electromechanical structure in an array of hundreds or even thousands of individual PMUTs.

Given these considerable challenges, it isn’t difficult to understand why the legacy computer aided engineering (CAE) software providers are struggling to provide solvers that can virtually prototype advanced devices in full-3D in a reasonable time-frame, and why the major players in ultrasonic fingerprint sensing are spending massive sums of money on dozens of prototyping iterations to refine their products.

The future of engineering

But this doesn’t have to be the case. In OnScale’s vision of the future of engineering, product developers are unshackled from the limitations of their desktop hardware or internal high-performance computing (HPC) clusters. They are able to bring to bear unlimited computing power to solve their most difficult problems – from their laptops.

OnScale’s offering consists of advanced, fully coupled multi-physics solvers that have been engineered over more than two decades specifically to run on commodity hardware in a massively parallel fashion. Integrated with a highly efficient cloud backend hosted by Amazon, OnScale lets engineers explore design spaces that were previously unreachable with legacy solvers.

There is perhaps no better example of how this kind of predictive power can be used to reduce cost, time to market and risk than the ultrasonic fingerprint sensor. Legacy solvers are limited to simulating simple 2D slices of such devices, forcing engineers to slog through time-consuming analytical and expensive empirical analyses to capture the considerable 3D effects that are not visible through the keyhole view their solvers provide.

Impressive analysis

Recently, OnScale has demonstrated the capability to simulate a full-3D ultrasonic fingerprint sensor with thousands of individual PMUTs, including not only the transducer array, but also the entire OLED/glass stackup and even the finger itself. Moreover, the analysis was completed in a matter of hours, using the Message Passing Interface (MPI) to allow hundreds of cloud cores to solve multiple physical domains (electromechanical, acoustic wave, thermal) simultaneously.
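OnScale has not published the internals of its solvers, but the general domain-decomposition pattern that makes this kind of MPI scaling possible can be sketched in a few lines. The example below is an illustration of the technique only, not OnScale's implementation: it assumes a simple 1D acoustic grid and the mpi4py bindings, splits the grid across ranks, and exchanges ghost cells with neighbouring subdomains at every time step.

```python
# Illustrative sketch only: a 1-D wave equation partitioned across MPI ranks,
# with halo (ghost-cell) exchange each step. Assumes mpi4py and that the rank
# count divides the global grid size evenly.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_GLOBAL = 4000                    # total grid points across all ranks (hypothetical)
C, DX, DT = 1500.0, 1e-4, 3e-8     # wave speed (m/s), grid spacing (m), time step (s)
n_local = N_GLOBAL // size

# Local field arrays with one ghost cell on each side for neighbour data.
u_prev = np.zeros(n_local + 2)
u_curr = np.zeros(n_local + 2)
if rank == 0:
    u_curr[1] = 1.0                # crude initial excitation at the left edge

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(1000):
    # Exchange ghost cells with neighbouring subdomains.
    comm.Sendrecv(u_curr[1:2], dest=left, recvbuf=u_curr[-1:], source=right)
    comm.Sendrecv(u_curr[-2:-1], dest=right, recvbuf=u_curr[0:1], source=left)

    # Explicit finite-difference update of the interior points.
    lap = u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]
    u_next = 2.0 * u_curr[1:-1] - u_prev[1:-1] + (C * DT / DX) ** 2 * lap
    u_prev, u_curr[1:-1] = u_curr.copy(), u_next

if rank == 0:
    print("finished:", size, "ranks x", n_local, "points each")
```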

The acoustic energy of every active element was modelled in the time domain such that interactions between them could be captured. As the waves propagated through the OLED and glass stack, ultimately reaching the ridges of the finger, reflections at each interface were captured and their effects on the final received signal at each PMUT element were precisely determined.
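The physics behind those interface reflections is worth spelling out, because it is what ultimately creates image contrast. At normal incidence the pressure reflection coefficient between two media is R = (Z2 - Z1)/(Z2 + Z1), where Z is acoustic impedance. The short sketch below uses representative textbook impedance values, not the actual device stackup, to show why a fingerprint valley (an air gap under the glass) reflects almost all of the energy while a ridge in contact with the surface couples much of it into the finger.

```python
# Minimal sketch of normal-incidence reflection at an acoustic interface.
# Impedance values are representative textbook figures, not the real stackup.
def reflection_coefficient(z1, z2):
    """Fraction of incident pressure reflected going from medium 1 into medium 2."""
    return (z2 - z1) / (z2 + z1)

MRAYL = 1e6                 # 1 MRayl = 1e6 kg/(m^2*s)
z_glass = 13.0 * MRAYL      # cover glass (approximate)
z_tissue = 1.6 * MRAYL      # skin ridge in contact with the surface (approximate)
z_air = 0.0004 * MRAYL      # air gap under a fingerprint valley

# A ridge couples energy into the finger; a valley reflects nearly everything,
# which is what produces contrast at the receiving PMUT elements.
print("glass -> ridge :", round(reflection_coefficient(z_glass, z_tissue), 3))
print("glass -> valley:", round(reflection_coefficient(z_glass, z_air), 3))
```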

From the full-3D data, OnScale was able to reconstruct the actual fingerprint image using a digital signal processing (DSP) algorithm.
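The article does not describe that DSP algorithm, but a deliberately simplified stand-in helps illustrate the idea: gate each element's receive waveform around the echo from the glass/finger interface, take the signal envelope, and map the peak amplitude to one image pixel. The sketch below assumes the solver output has already been arranged into a hypothetical (rows, cols, samples) array; it is not OnScale's reconstruction method.

```python
# Simplified illustrative reconstruction: one pixel per PMUT element.
import numpy as np
from scipy.signal import hilbert

def reconstruct_image(rx_signals, gate_start, gate_stop):
    """Map each element's gated echo amplitude to a pixel.

    rx_signals : ndarray, shape (rows, cols, samples) of time-domain receive data
    gate_start, gate_stop : sample indices bracketing the interface echo
    """
    # Envelope detection via the analytic signal, element by element.
    envelope = np.abs(hilbert(rx_signals, axis=-1))
    # Peak echo amplitude inside the time gate becomes the pixel value.
    image = envelope[..., gate_start:gate_stop].max(axis=-1)
    # Normalise to [0, 1]: valleys (strong reflections) appear bright, ridges dark.
    return (image - image.min()) / (image.max() - image.min() + 1e-12)

# Example with synthetic data standing in for solver output.
rng = np.random.default_rng(0)
fake_rx = rng.standard_normal((64, 64, 512)) * 0.01
print(reconstruct_image(fake_rx, gate_start=200, gate_stop=300).shape)  # (64, 64)
```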

Previously, an engineer would have to wait for physical prototypes and empirical data before they could begin testing their DSP algorithms, and would have to modify them each time the part was revised; now they can optimise those algorithms on a virtual prototype before any silicon is produced. The implications of this capability are immense.

The number of prototype iterations, each of which costs hundreds of thousands of dollars, can be significantly reduced. Software and firmware can be written, tested and debugged using virtual data. Most importantly, foundry process and technology platform improvements can be explored well in advance, enabling fundamental process and product strategies to be mapped out that were previously difficult to envision and execute.

Very soon, the idea of first-pass silicon in MEMS devices will become a reality as modelling and simulation capabilities approach the reliability and efficiency currently found only in the realm of EDA and similar disciplines. OnScale is leading that charge, with tools designed to give its users the power of a supercomputer – on demand.

The future of engineering has arrived, and it exists at the intersection of powerful, parallelised multi-physics solvers and the vast scalability of big compute in the cloud.

Ryan Diestelhorst is with OnScale.
