By Paul Werbaneth, MEMS Industry Group
Submitted by The Southwest Center for Microsystems Education
The Southwest Center for Microsystems Education, a National Science Foundation Advanced Technological Education Center, is working on a project to better understand the current state of the micro and nanotechnology based industry technician workforce. Through this project, we aim to enable our center to best support Community Colleges’ efforts to start micro and nano technology programs that use SCME developed curricula.
One goal of this project is a map of related high-tech industries relative to their local Community Colleges. We can then identify the regions where our programs will make the greatest impact. This allows us to advocate for, and support, the adoption of micro and nano education by Community Colleges on behalf of their regional micro and nano and related industries.
Our second goal is a trend analysis of several mapped industries. The SCME has divided the micro-nano related industries into several categories based on specialty and industry revenue. We aim to identify at least ten companies in each bracket and to determine their workforce needs so that we can target our educational impact efforts to yield the best results for both industry and education! These trends are then presented to Community Colleges near micro and nano tech clusters to provide justification for incorporating microsystems-based curricula into their programs. This enables the SCME to direct scarce educational resources to the institutions where their impact will be greatest, resulting in a more informed and capable workforce.
This is where we need your help! As leaders in MEMS and related industries, please complete the survey found by clicking the following link:
Aggregate findings will be shared with you as well as information pertaining to educational resources that will assist you as you build your technician workforce pipeline and enable you to be in a better position to plan workforce growth. Please consider collaborating with SCME to support our shared industrial workforce educational improvement goals!
Previous MIG Blog:
Two weeks ago, MIG had a wonderful time at SEMICON West, participating in a variety of constructive and gratifying industry-focused networking opportunities. The jam-packed event saw both MIG and its members engage in everything from demonstrations to cocktail receptions.
Over 30 member companies exhibited on the show floor, showing off their latest in equipment, materials, packaging solutions, design automation tools, foundry services, product development and R&D, while still more members busied themselves with private appointments off of the show floor.
MIG welcomed increased traffic at our booth, showing just how relevant MEMS has become, while reinforcing why an industry association is the key to connecting to partners to increase business opportunities. The MIG team enthusiastically spoke about the benefits of joining MEMS Industry Group while promoting our members to interested individuals.
SEMI hosted its traditional MEMS session, “Next Generation MEMS,” on Tuesday, July 8th. The session featured all MIG member and partner companies, including Yole Développement, Qualcomm, Silex Microsystems, GE Global Research, Si-Ware, SolMateS, NIST and EV Group. The shared content emphasized new challenges and opportunities for the MEMS supply chain in meeting the needs of the expanding range of mobile devices, wearables and smart objects in the Internet of Things.
MIG also hosted its increasingly-popular cocktail party at Restaurant Lulu on Wednesday night, July 9th. With over 300 people in attendance, it was THE place to be for MIG members to meet and network.
MIG would like to thank our Platinum Sponsor Plan Optik and Bronze Sponsors C2MI, Expertech and Oxford Instruments for making SEMICON West and the cocktail party possible. If you want to relive the memories or are curious about what you missed, be sure to check out photos from the event.
MIG’s very own Karen Lightman recently graced the front page of IVAM Microtechnology Network‘s INNO magazine with her thorough discussion of MEMS and wearables. In the piece, Karen talks about the US’s focus on wearables technology while other regions of the world spend their time and resources elsewhere, and how these efforts are being backed with crowdfunding support. To read Karen’s thoughts on this explosive industry, head to ivam.de and download the PDF!
Guest post by Tony Massimini, Semico Research
There has been a great deal of activity among companies within the sensor fusion ecosystem. Mergers and acquisitions are changing the competitive landscape.
As a quick background, sensor fusion is the technology of combining data from multiple sensors and deriving intelligence from that data. It is the foundation for motion tracking, navigation, context awareness, location-based services, augmented reality and more, and it is the basis for future innovative applications. The brains behind sensor fusion are the algorithms, which are usually embedded in a 32-bit microcontroller core or similarly powerful processing device known as a sensor hub.
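To make the idea concrete, here is a minimal sketch of one classic fusion technique: a complementary filter that blends fast-but-drifting gyro integration with a noisy-but-stable accelerometer tilt estimate. This is an illustration only, not any vendor’s shipping algorithm; the sample rate, blend factor and axis convention are assumptions.

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle implied by the gravity vector (radians)."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch_prev, gyro_rate, pitch_from_accel, dt, alpha=0.98):
    """Blend gyro integration (fast, but drifts) with the accelerometer
    tilt (noisy, but drift-free) -- the core trick of simple fusion."""
    return alpha * (pitch_prev + gyro_rate * dt) + (1 - alpha) * pitch_from_accel

# Simulate a board held at a 10-degree tilt while the gyro reports a
# small constant bias; the filter converges near the true tilt anyway.
true_tilt = math.radians(10.0)
raw = (-9.81 * math.sin(true_tilt), 0.0, 9.81 * math.cos(true_tilt))
pitch = 0.0
for _ in range(500):  # 500 samples at an assumed 100 Hz
    pitch = complementary_filter(pitch, gyro_rate=0.001,
                                 pitch_from_accel=accel_pitch(*raw), dt=0.01)
```

Real sensor hub libraries use far more sophisticated algorithms (Kalman filters, 9-axis quaternion fusion), but the principle of weighting complementary sensor strengths is the same.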
In May 2014, Fairchild announced the acquisition of Xsens, the Dutch company known for motion tracking software. Xsens has been doing motion tracking for film and other such applications, using modules built with low-cost, consumer-grade inertial MEMS sensors from STMicroelectronics. At the time of the acquisition, Fairchild also announced that it would soon be bringing MEMS sensors to market.
In the last couple of weeks there has been a flurry of activity amongst the sensor fusion ecosystem players. On June 24, 2014 Audience announced it would acquire Sensor Platforms. The buyout is expected to be complete by mid-July 2014. On July 7, 2014 InvenSense announced it was acquiring two companies, Movea and Trusted Positioning, Inc. These acquisitions are to be finalized by the end of September 2014.
Audience is a relatively small company with about $150 million in annual sales. It has been known mainly for voice and sound processing. It was designed into the Apple iPhone 4S but did not maintain its design win in Apple’s next generation iPhone. Sensor Platforms is a third-party sensor fusion algorithm developer. The company has been focused on context awareness and indoor navigation. Sensor Platforms licenses its algorithm to OEMs and to sensor hub chip vendors. Audience had been working with Sensor Platforms for its always-on sensor fusion for voice and motion and decided to acquire the company for $41 million. The name of Sensor Platforms will no longer be used following completion of the acquisition.
Audience also has a motion processor, MQ100, which will launch late 2014. This is a DSP based device which will function as a sensor hub controller. The sensor fusion algorithm developed with Sensor Platforms will be embedded in MQ100.
InvenSense has been a fast-growing company delivering inertial motion sensors (gyroscope, accelerometer, magnetometer) in multi-chip packages with its digital motion processor. The company has been focused on motion tracking and pedestrian navigation for mobile devices and wearables. At the end of 2013 it acquired the microphone business of Analog Devices. Movea, like Sensor Platforms, is a sensor fusion algorithm developer that licenses its algorithm to OEMs and sensor hub controller vendors. Movea provides software for ultra-low-power location, activity tracking and context sensing. Its IP is found in consumer mobile (smartphones and tablets), TV interaction and wearable sports & fitness applications. Movea’s context analysis uses both motion and audio sensors.
Trusted Positioning Inc. (TPI) is a software company providing indoor/outdoor positioning solutions for mobile and wearable devices. This has been a key area of development for InvenSense. TPI’s platform also provides inertial navigation software solutions for in-vehicle navigation, personnel tracking, and machine guidance and control.
Semico has stated several times that the value for Sensor Fusion is in the algorithm. The average selling prices for the sensors are falling rapidly. The hardware is becoming more of a commodity.
These acquisitions show that the chip vendors want to add value to their products.
Sensor fusion is expanding to include more data sources. Sound for always-on context awareness and to provide spatial awareness is the next step. Therefore, the microphones are key elements. More sensors for biological and environmental data will come into use. InvenSense, Audience and Fairchild want to integrate more IP in their respective technologies.
Sensor fusion is moving beyond smartphones and tablets into wearables and other products. At a MEMS Industry Group panel at the Consumer Electronics Show in Las Vegas, Jan. 2014, Semico stated that wearables with 9-axis or more and sensor fusion features would be a high growth market. It was at CES 2014 that InvenSense and Movea in separate announcements revealed they had reference platforms for wearables with 9+ axis sensor fusion. Clearly both companies were on the same page at that time.
Impact on the Market
Less than a year ago, there were four companies licensing sensor fusion algorithms: PNI Sensor, Sensor Platforms, Movea and Hillcrest Labs.
In June 2013 PNI announced an ASIC, Sentral, which embeds its algorithm. It still licenses its algorithm but on a selective and strategic basis.
Following these recent acquisitions, it is assumed that existing licensing commitments involving Sensor Platforms and Movea will be honored, but Semico believes it is unlikely that new licenses will be extended. Both InvenSense and Audience will have sensor hub controllers in competition with other companies. Thus, the only remaining independent third-party sensor fusion developer without a competing chip is Hillcrest Labs.
Hillcrest Labs has a wide customer base. In March 2014, it was announced that Hillcrest Labs and Bosch Sensortec were collaborating on a sensor hub solution for head-mounted displays and wearable devices. Following the recent announcements concerning its competitors, Hillcrest Labs has seen an uptick in interest in its products.
OEMs want options and do not want to be locked into one supplier. Semico believes that one option sensor hub vendors and OEMs will pursue is to develop their own in-house sensor fusion algorithms. However, this expertise is in short supply; it is a specialized area of study, and even with a strong team it could take one to two years to get up to speed.
The market for sensor fusion and sensor hub controllers is growing rapidly. Smartphones are currently the largest market, but the technology is being leveraged into other areas, especially wearable devices for sport, health, fitness and medical.
Semico forecasts that the market for sensor hub controllers will reach 2.5 billion units by 2018, a CAGR (’13 to ’18) of 27.4%. The wearables market will see a CAGR of 114%, reaching over 300 million units for devices with 9 axes or more.
MEMS revenues due to sensor fusion will grow to $7.7 billion by 2018, a CAGR (’13 to ’18) of 20.3%. In 2013, MEMS revenues in sensor fusion applications accounted for 23.6% of the total MEMS market. By 2018, this will grow to 34% of the MEMS market.
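As a quick sanity check on these forecasts, the CAGR arithmetic can be run backward to the implied 2013 baselines. This is a rough check, not Semico’s published 2013 figures:

```python
def cagr(start, end, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# 2.5 billion sensor hub units in 2018 at a 27.4% CAGR over 2013-2018
# (5 years) implies a 2013 base of roughly 745 million units.
implied_2013_units = 2.5e9 / (1 + 0.274) ** 5

# $7.7B sensor-fusion MEMS revenue in 2018 at 20.3% CAGR implies a
# 2013 base of roughly $3.06 billion.
implied_2013_revenue = 7.7e9 / (1 + 0.203) ** 5
```

Both implied baselines are internally consistent with the growth rates quoted above.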
For more detailed information on the sensor fusion market and the companies mentioned in this blog, please contact Rick Volgelei at firstname.lastname@example.org.
Guest post by Eric Levy-Myers
This was the 30th anniversary of the Hilton Head conference, and the mood was one of amazement at how far the industry has come since the first meeting 30 years ago. The conference chairman noted that solid-state sensors have taken over the world, as evidenced by the fact that there are more smartphones than toothbrushes in the world. This set the stage for an underlying question of the conference: what happens to MEMS in the next 30 years? The Rump Session on Wednesday evening addressed the topic with special speakers.
The first day of the conference focused on Bio MEMS. In the first plenary session, Dr. Oliver Paul of the University of Freiburg spoke of directly linking to the brain to both sense and stimulate neurons. He noted with some humor that the brain has 10^11 neurons, so given the number of neurons we can sense today, and a Moore’s law for probe sensors that doubles every seven years, the curve says we will be able to sense all the brain’s neurons by 2240.
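That 2240 figure is easy to reproduce as a back-of-the-envelope calculation. The starting channel count below is purely an assumption, and different assumptions shift the answer by decades:

```python
import math

def year_all_neurons(n0, start_year=2014, doubling_years=7, total=1e11):
    """Year when probe channel count reaches `total` neurons, if we can
    sense n0 neurons in start_year and capacity doubles every
    doubling_years years (the 'Moore's law' pace cited in the talk)."""
    doublings = math.log2(total / n0)
    return start_year + doubling_years * doublings

# Assuming ~100 sensed neurons today (hypothetical), the crossover
# lands in the same ballpark as the speaker's 2240 estimate.
estimate = year_all_neurons(n0=100)
```

With a starting count of ~100 channels the answer comes out around 2223; a smaller assumed starting count pushes it toward 2240 and beyond.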
The meat of his talk focused on four areas of invasive brain systems:
- Epicortical grids that lie on the surface of the brain to control external machines, such as robots that allow paralyzed patients to feed themselves.
- Array implants to sample many points in the brain.
- Deep Brain Probes to regulate diseases such as Parkinson’s.
- New optical stimulation technology that stimulates neurons using wavelengths tuned to specific neuron types.
There are many challenges to getting these technologies into broad use, not least of which is that the brain’s immune response attacks the probes, rendering them useless within weeks or months. The research papers presented later detailed how researchers are trying to overcome these and other challenges, in sessions on Biomedical and Cellular Devices and on Bioassays.
Day two at the conference focused on the physical aspects of MEMS devices and fabrication. The plenary speaker, Dr. Robert Carpick of the University of Pennsylvania, introduced us to a term most people had not heard of: “Tribology.” As he explained: “We did not like the term ‘science and engineering of interacting surfaces in relative motion,’ so we grabbed Greek words to make the word Tribology.” His thesis was that at the macro level, scientists and engineers understand how surfaces in contact interact, and they have methods to reduce the effects of that contact, such as friction and material exchange. But at the MEMS level much less is known. This is one reason MEMS devices avoid contact points, and why MEMS manufacturers can be so frustrated by stiction, or stickiness. Dr. Carpick explained several areas that hold promise to allow MEMS parts to touch and rub indefinitely without ill effect; one method is to fill a sealed system with alcohol vapor. The research papers in the technical session extended the topics to include Materials and Surfaces, Fabrication & Materials and Magnetic Transducers.
The day-three plenary speaker, Dr. Kurt Petersen, drew on his vast experience in successful MEMS entrepreneurship and shared his lessons on what makes startups succeed. He set a very optimistic tone for the future of MEMS: one that is bright, but not a given. He offered many juicy tidbits for anyone who wants to successfully start, run and exit a business. There were too many to repeat here, but these stood out:
- Have a great team that is persistent and dedicated to the company’s vision.
- Get your product into production fast. This fit well with the advice from the Sunday Workshop session where “fail fast so you can learn and adapt fast” was a theme.
- Know your market inside and out because the investors will.
- Inventing is great, but designing for manufacturing and efficient production is probably more important. You cannot make money if you cannot make and test it economically.
The afternoon research papers covered the latest research in High Q Resonators and Resonant Systems, promising, as did all the papers, much more MEMS innovation to come in the future.
The Rump Session highlighted the Sigma Group, a collection of SciFi writers distinguished by their previous careers as scientists, engineers and program managers. They also use those skills to advise the government about future issues of concern and opportunity. They spent the week talking with conference participants to gather material, so in the session they offered many insights. Over beers afterward, they shared even more interesting ideas with the audience!
Since the chair of the panel and the writers noted that SciFi correctly predicted most aspects of the internet and the sensors we use today, we can perhaps assume today’s wild predictions are not as wild as we think. Perhaps the most interesting idea had to do with the brain systems discussed on day one: why not implant something in the nose that grows fibers into the brain, with millions of micro-strands acting as probes? Overall, attendees and presenters at Hilton Head 2014 expressed much optimism that the MEMS industry will continue to grow into more unexpected parts of our lives as we move to a world of trillions of sensors and the Internet of Things.
The next conference is in Anchorage, Alaska, in June 2015. Hope to see you all there.
Guest post by Mike Stanley, Systems Engineer at Freescale
Back in February, I wrote an article describing the Xtrinsic sensor fusion library for Kinetis MCUs. Over the intervening months, we’ve made a number of improvements:
- Demo and Development versions of the kit have been consolidated into a single installer that covers all options.
- The limited “Evaluation” version has been removed. In its place, we offer free board-locked licenses tied to your Freedom development board. Licenses are generated automatically during the installation procedure. You now have access to the full development version with your first download.
- We’ve added support for two new base boards, bringing the total to four: FRDM-KL25Z, FRDM-KL26Z, FRDM-K20D50M and FRDM-K64F.
- We’ve updated the Xtrinsic Sensor Fusion Toolbox for Android to support the new boards. We also added several neat new features I’ll mention below.
- We’ve published our Xtrinsic Sensor Fusion Toolbox for Windows. It’s not a clone of the Android variant, although there are some common features. It goes well beyond that tool, offering deeper insight into some of the underlying calibration and fusion routines.
- We’ve reworked the Android app landing page into a one-stop-shop for all things related to sensor fusion. Visit http://www.freescale.com/sensorfusion to find convenient links for everything you’ll need to get your project started. That includes all of the above, plus training materials, and a link to the Freescale Software Services group. They can provide quotes for production software licenses and custom consulting work.
Figure 1 will look familiar to readers who have experimented with the Xtrinsic Sensor Fusion Toolbox for Android. The rotating PCB display shown here was inspired by that app. The Windows version gives you some really nice additions. First and foremost is support (via UART/USB wired connections) for the FRDM-FXS-9AXIS and FRDM-FXS-MULTI sensor boards. Unlike the FRDM-FXS-MULTI-B board, these do not have Bluetooth modules and cannot be used with the Android version of the toolbox. That’s no problem for the Windows variant, which uses the virtual serial port feature of the OpenSDA interface to talk with the boards. Simply plug your boards into your Windows machine, start the application and click the “Auto Detect” button you see in the upper right of the figure. The application will cycle through your PC’s serial ports until it finds one connected to a Freedom board running the template app from the Xtrinsic Sensor Fusion Library for Kinetis MCUs. And if you have a Bluetooth-enabled PC, pair it to your FRDM-FXS-MULTI-B and run wirelessly. The communications interface is the same from the perspective of the Windows GUI.
Figure 1: Xtrinsic Sensor Fusion Toolbox for Windows – Device View
Just like the Android version, you can select from a variety of fusion algorithms. Also shown are the version of embedded firmware running on your Freedom board, along with the type of board (assuming you have debug packets enabled on the board).
Figure 2: Xtrinsic Sensor Fusion Toolbox for Windows – Sensors View
Figure 2 shows you the “Sensors” view of the application. Here you have current values and value versus time plots for raw accelerometer and gyro readings, plus calibrated magnetometer.
Figure 3: Xtrinsic Sensor Fusion Toolbox for Windows – Dynamics View
The “Dynamics” view, shown in Figure 3, lets you look at some of the virtual sensor outputs from the sensor fusion library. These include orientation in roll/pitch/compass-heading form, angular velocity and acceleration. You might wonder what the difference is between “angular velocity” and the gyro readings on the “Sensors” page. If your algorithm selection supports a physical gyro, then the values in Figure 3 have had gyro offsets subtracted from them. If your algorithm does not include gyro support, then the angular velocity shown here is the result of a “virtual gyro” calculation (see “Building a virtual gyro“).
The accelerometer reading on the “Sensors” page includes the effects of both gravity and linear acceleration. The “Acceleration” item on the “Dynamics” page has had the effects of gravity removed, so it represents only the linear acceleration of your board.
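The gravity subtraction can be sketched in a few lines. This uses one common roll/pitch sign convention and is an illustration, not the toolbox’s actual implementation:

```python
import math

def linear_acceleration(ax, ay, az, roll, pitch, g=9.81):
    """Subtract gravity, rotated into the sensor frame by roll/pitch
    (radians, one common aerospace convention), from raw accel data."""
    gx = -g * math.sin(pitch)
    gy = g * math.sin(roll) * math.cos(pitch)
    gz = g * math.cos(roll) * math.cos(pitch)
    return (ax - gx, ay - gy, az - gz)

# A board at rest, pitched 30 degrees: the raw reading is pure gravity,
# so the computed linear acceleration should be (0, 0, 0).
tilt = math.radians(30.0)
raw = (-9.81 * math.sin(tilt), 0.0, 9.81 * math.cos(tilt))
lin = linear_acceleration(*raw, roll=0.0, pitch=tilt)
```

In the real library the orientation comes from the fusion algorithm itself, which is why the quality of the “Acceleration” output depends on the algorithm you have selected.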
Figure 4: Xtrinsic Sensor Fusion Toolbox for Windows – Magnetics View
I think Figure 4 shows the neatest feature introduced in the toolbox. Those of you who have seen prior generations of Freescale magnetometer demos will recognize the computed hard and soft iron correction coefficients on the left, along with our “magnetic calibration meter.” What’s new is the 3D-to-2D projection shown on the right. These are the measured data points selected by the magnetic calibration library for use in determining the correction coefficients. Ideally, the figure should be circular, centered at (0,0), with a radius equal to the magnitude of the earth’s magnetic field. Nearby magnets, fixed spatially relative to the sensor, will shift the center to some non-zero value. Ferrous materials, fixed spatially relative to the sensor, will distort the circle into an ellipse, and possibly rotate it. If sources of interference are not fixed relative to the sensor, you’ll still see distortion, but it will not behave in as predictable a fashion and isn’t as easily corrected. It’s educational to bring your board near sources of magnetic interference and watch the constellation distort, then self-repair over time.
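Here is a crude illustration of hard-iron estimation. Real calibration libraries perform a least-squares ellipsoid fit, but the simple min/max midpoint below captures the core idea of recovering a constant offset from field samples taken over many orientations (the field magnitude and offset are made-up values):

```python
import math

def hard_iron_offset(samples):
    """Estimate the hard-iron offset as the midpoint of the min/max
    envelope on each axis -- a crude stand-in for the least-squares
    ellipsoid fit a real calibration library performs."""
    axes = list(zip(*samples))
    return tuple((max(a) + min(a)) / 2 for a in axes)

# Synthetic data: a 50 uT field sampled over many orientations, shifted
# by a hypothetical (10, -5, 3) uT hard-iron offset from a nearby magnet.
offset = (10.0, -5.0, 3.0)
samples = []
for i in range(36):                          # azimuth sweep
    for j in range(19):                      # inclination sweep, 0..180 deg
        theta, phi = math.radians(10 * i), math.radians(10 * j)
        samples.append((50 * math.sin(phi) * math.cos(theta) + offset[0],
                        50 * math.sin(phi) * math.sin(theta) + offset[1],
                        50 * math.cos(phi) + offset[2]))
est = hard_iron_offset(samples)              # recovers (10, -5, 3)
```

Soft-iron correction is harder, since the ellipse distortion requires fitting a full correction matrix rather than a single offset vector.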
Figure 5: Xtrinsic Sensor Fusion Toolbox for Android – Device View
Figures 5 and 6 are screen dumps from the latest version of the Xtrinsic Sensor Fusion Toolbox for Android. If you enable display of debug packet information in the preferences screen, you’ll get additional information displayed on the device view:
- The version of software running on your development board (Version 417 in this case)
- The number of ARM CPU “systicks” occurring during one iteration of the main sensor fusion loop. Take this number, divide by the CPU clock rate, and you have the number of seconds required for each iteration through the loop. For the case above, 514,860/48MHz = 10.7ms. The number is computed in real time, and changes depending upon which algorithm you are running.
- The board type you are using (a lot of the boards look alike)
I should mention that all of the above are also shown in the “Device” tab in the Windows-based toolbox.
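The systick arithmetic in the list above is worth spelling out: loop time is simply systicks divided by the CPU clock rate.

```python
# Check the loop-time arithmetic quoted above:
# systicks per iteration / CPU clock rate = seconds per iteration.
systicks = 514_860
cpu_hz = 48_000_000            # 48 MHz Kinetis core clock
loop_ms = systicks / cpu_hz * 1000
# loop_ms comes out to roughly 10.7 ms, matching the figure in the text.
```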
Figure 6: Xtrinsic Sensor Fusion Toolbox for Android – Canvas View
Figure 6 shows the new “Canvas View,” which was just added to the Android version of the Toolbox. It demonstrates how we could use the sensor fusion quaternion output to create a wireless pointer. The accel/gyro and 9-axis algorithms work best; the 3-axis options are pretty much worthless due to basic limitations of using just those sensors, although I will note that gyro-based air mice are possible, just not with this particular algorithm. Check or uncheck the “Absolute” checkbox on the Fusion Settings Bar to switch between the “absolute” and “relative” versions of the wireless pointer algorithm. And be sure to read the “Canvas” chapter of the in-app documentation to get full details about how it works.
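As a rough sketch of how a quaternion output could drive such a pointer: rotate an assumed “forward” axis of the board by the orientation quaternion and map the result onto screen coordinates. The toolbox’s real mapping and axis conventions aren’t documented here, so the forward axis and screen mapping below are assumptions.

```python
import math

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using v' = v + w*t + qv x t, where t = 2*(qv x v)."""
    w, x, y, z = q
    tx = 2.0 * (y * v[2] - z * v[1])
    ty = 2.0 * (z * v[0] - x * v[2])
    tz = 2.0 * (x * v[1] - y * v[0])
    return (v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx))

def pointer_xy(q, width=1920, height=1080):
    """Map the board's assumed +X forward axis onto screen coordinates:
    yaw deflection moves the cursor horizontally, pitch vertically."""
    fx, fy, fz = rotate(q, (1.0, 0.0, 0.0))
    return (width / 2 * (1 + fy), height / 2 * (1 - fz))

# Identity orientation points at the center of the screen.
center = pointer_xy((1.0, 0.0, 0.0, 0.0))
```

A “relative” pointer mode would instead accumulate frame-to-frame orientation deltas, which is why the absolute/relative checkbox changes the feel of the cursor.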
Our goal with the new http://www.freescale.com/sensorfusion page is to give you everything you need to get started quickly. Relevant hardware, libraries, tools, training materials and support options have been brought together in one place. If you already have the CodeWarrior for Kinetis MCUs IDE installed on your Windows machine, and have your development boards on hand, you can be up and running ten minutes from the time you land on the page. And as always, if you have suggestions or ideas for how to improve things, just drop me a line.