Sensor Fusion in a State of Flux as Companies Fuse Together

Guest post by Tony Massimini, Semico Research

There has been a great deal of activity among companies within the sensor fusion ecosystem. Mergers and acquisitions are changing the competitive landscape.

As a quick background, sensor fusion is the technology of combining data from multiple sensors and deriving intelligence from that data. It is the foundation for motion tracking, navigation, context awareness, location-based services, augmented reality and more, and the basis for future innovative applications. The brains behind sensor fusion are the algorithms, usually embedded in a 32-bit microcontroller core or similarly powerful processing device known as a sensor hub.
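
To make "deriving intelligence from that data" concrete, here is a minimal sketch of one of the simplest fusion algorithms a sensor hub might run: a complementary filter that blends a gyroscope's fast-but-drifting angle with an accelerometer's noisy-but-stable tilt estimate. All names and constants are illustrative, not taken from any product discussed below.

```python
import math

ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term

def fuse_pitch(pitch_prev_deg, gyro_rate_dps, accel_x_g, accel_z_g, dt_s):
    """Blend an integrated gyro angle with an accelerometer tilt estimate."""
    gyro_pitch = pitch_prev_deg + gyro_rate_dps * dt_s            # integrate gyro
    accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))  # tilt from gravity
    return ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch

# Example: one second of 100 Hz samples from a slowly tilting device
pitch = 0.0
for _ in range(100):
    pitch = fuse_pitch(pitch, gyro_rate_dps=5.0, accel_x_g=0.09, accel_z_g=0.996, dt_s=0.01)
print(f"fused pitch: {pitch:.1f} degrees")
```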

In May 2014, Fairchild announced the acquisition of Xsens, the Dutch company known for motion tracking software. Xsens has applied motion tracking to film and other such applications, using modules built from low-cost, consumer-grade inertial MEMS sensors from STMicroelectronics. At the time of the acquisition, Fairchild also announced that it would soon bring MEMS sensors of its own to market.

In the last couple of weeks there has been a flurry of activity amongst the sensor fusion ecosystem players.  On June 24, 2014 Audience announced it would acquire Sensor Platforms.  The buyout is expected to be complete by mid-July 2014.  On July 7, 2014 InvenSense announced it was acquiring two companies, Movea and Trusted Positioning, Inc.  These acquisitions are to be finalized by the end of September 2014.

Audience is a relatively small company with about $150 million in annual sales, known mainly for voice and sound processing. It was designed into the Apple iPhone 4S but did not maintain its design win in Apple's next-generation iPhone. Sensor Platforms is a third-party sensor fusion algorithm developer focused on context awareness and indoor navigation; it licenses its algorithms to OEMs and to sensor hub chip vendors. Audience had been working with Sensor Platforms on always-on sensor fusion for voice and motion and decided to acquire the company for $41 million. The Sensor Platforms name will no longer be used once the acquisition is complete.

Audience also has a motion processor, the MQ100, which will launch in late 2014. This DSP-based device will function as a sensor hub controller, with the sensor fusion algorithms developed with Sensor Platforms embedded in it.

InvenSense has been a fast-growing company delivering inertial motion sensors (gyroscope, accelerometer, magnetometer) in multi-chip packages with its digital motion processor. The company has been focused on motion tracking and pedestrian navigation for mobile devices and wearables, and at the end of 2013 it acquired the microphone business of Analog Devices. Movea, like Sensor Platforms, is a sensor fusion algorithm developer that licenses its algorithms to OEMs and sensor hub controller vendors. Movea provides software for ultra-low-power location, activity tracking and context sensing; its IP is found in consumer mobile (smartphones and tablets), TV interaction and wearable sports & fitness applications. Movea's context analysis uses both motion and audio sensors.

Trusted Positioning Inc. (TPI) is a software company providing indoor/outdoor positioning solutions for mobile and wearable devices, a key area of development for InvenSense. TPI's platform also provides inertial navigation software for in-vehicle navigation, personnel tracking, and machine guidance and control.

Semico Spin

Semico has stated several times that the value in sensor fusion is in the algorithms. Average selling prices for the sensors are falling rapidly, and the hardware is becoming more of a commodity.

These acquisitions show that the chip vendors want to add value to their products.

Sensor fusion is expanding to include more data sources. The next step is sound, both for always-on context awareness and for spatial awareness, which makes microphones key elements. More sensors for biological and environmental data will come into use. InvenSense, Audience and Fairchild all want to integrate more IP into their respective technologies.

Sensor fusion is also moving beyond smartphones and tablets into wearables and other products. At a MEMS Industry Group panel at the Consumer Electronics Show in Las Vegas in January 2014, Semico stated that wearables with 9-axis (or more) sensor fusion would be a high-growth market. It was at CES 2014 that InvenSense and Movea, in separate announcements, revealed reference platforms for wearables with 9+ axis sensor fusion. Clearly both companies were on the same page at that time.

Impact on the Market

Less than a year ago, there were four companies licensing sensor fusion algorithms: PNI Sensor, Sensor Platforms, Movea and Hillcrest Labs.

In June 2013, PNI announced an ASIC, Sentral, which embeds its algorithm. It still licenses the algorithm, but on a selective and strategic basis.

Following these recent acquisitions, it is assumed that existing licensing commitments involving Sensor Platforms and Movea will be honored, but Semico believes it is unlikely that new licenses will be extended. Both InvenSense and Audience will have sensor hub controllers competing with other companies' chips. That leaves Hillcrest Labs as the only remaining independent third-party sensor fusion developer without a competing chip.

Hillcrest Labs has a wide customer base. In March 2014, it was announced that Hillcrest Labs and Bosch Sensortec were collaborating on a sensor hub solution for head-mounted displays and wearable devices. Following the recent announcements concerning its competitors, Hillcrest Labs has seen an uptick in interest in its products.

OEMs want options and do not want to be locked into one supplier. Semico believes that one option sensor hub vendors and OEMs will pursue is to develop their own in-house sensor fusion algorithms. However, this expertise is in short supply; it is a specialized area of study, and even with a strong team it could take one to two years to get up to speed.

The market for sensor fusion and sensor hub controllers is growing rapidly. Smartphones are currently the largest market, but the technology is being leveraged into other areas, especially wearable devices for sports, health, fitness and medical applications.

Semico forecasts that the market for sensor hub controllers will reach 2.5 billion units by 2018, a 27.4% CAGR from 2013 to 2018. The wearable market will see a 114% CAGR, reaching over 300 million units for devices with nine or more axes.

MEMS revenues attributable to sensor fusion will grow to $7.7 billion by 2018, a 20.3% CAGR from 2013 to 2018. In 2013, MEMS revenues in sensor fusion applications accounted for 23.6% of the total MEMS market; by 2018, this will grow to 34%.
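
As a quick sanity check on those forecasts, a CAGR relates the endpoints by end = start × (1 + r)^n. The sketch below derives the implied 2013 base values; these are computed from the stated numbers, not quoted from Semico:

```python
# Implied 2013 base values from the stated 2018 forecasts and CAGRs.
def implied_base(end_value, cagr, years):
    return end_value / (1 + cagr) ** years

print(implied_base(2.5e9, 0.274, 5))  # ~745M sensor hub controllers in 2013
print(implied_base(7.7e9, 0.203, 5))  # ~$3.1B of fusion-driven MEMS revenue in 2013
```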

For more detailed information on the sensor fusion market and the companies mentioned in this blog, please contact Rick Volgelei at rickv@semico.com.  

Hilton Head 2014 Wrapup – 30 years of MEMS!

Guest post by Eric Levy-Myers

This was the 30th anniversary of the Hilton Head conference, and the mood was one of amazement at how far the industry has come since the first meeting 30 years ago. The conference chairman noted that solid-state sensors have taken over the world, evidenced if by nothing else by the fact that there are more smartphones than toothbrushes in the world. This set the stage for an underlying question of the conference: what happens to MEMS in the next 30 years? The Rump Session on Wednesday evening addressed the topic with special speakers.

The first day of the conference focused on Bio MEMS. In the first plenary session, Dr. Oliver Paul of the University of Freiburg spoke of directly linking to the brain to both sense and stimulate neurons. He noted with some humor that the brain has about 10^11 neurons, so given the number of neurons we can sense today, and a Moore's law for probe sensors that doubles every seven years, the curve says that we will be able to sense all the brain's neurons by about 2240.
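
A rough reconstruction of that extrapolation is below. The talk did not state its base figure, so the starting channel count here is an assumption, chosen only so the curve lands near the quoted year:

```python
# Back-of-the-envelope version of the "all neurons by ~2240" extrapolation.
import math

base_year = 2014
base_channels = 20   # ASSUMPTION: simultaneously sensed neurons today; not from the talk
target = 1e11        # approximate neuron count of the human brain
doublings = math.log2(target / base_channels)
print(round(base_year + 7 * doublings))  # one doubling per 7 years -> ~2240
```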

The meat of his talk focused on four areas of invasive brain systems:

  1. Epicortical Grids that lie on the surface of the brain to control external machines, such as robots that allow paraplegics to feed themselves.
  2. Array implants to sample many points in the brain.
  3. Deep Brain Probes to regulate diseases such as Parkinson’s.
  4. New Optical Stimulation technology that stimulates neurons based on wavelengths tuned to specific neuron types.

There are many challenges to getting these technologies into broad use, not least of which is that the brain's immune system attacks the probes, rendering them useless within weeks or months. The research papers presented in the later sessions detailed how researchers are trying to overcome these and other challenges with Biomedical and Cellular Devices, and Bioassays.

Day two of the conference focused on the physical aspects of MEMS devices and fabrication. The plenary speaker, Dr. Robert Carpick of the University of Pennsylvania, introduced us to a term most people had not heard of: "Tribology." As he explained: "We did not like the term 'science and engineering of interacting surfaces in relative motion,' so we grabbed Greek words to make the word Tribology." His thesis was that at the macro level, scientists and engineers understand how surfaces in contact interact, and they have methods to reduce the effects of this contact, such as friction and material exchange. But at the MEMS level much less is known. This is one reason MEMS devices avoid contact points, and why MEMS manufacturers can be so frustrated by stiction, or stickiness. Dr. Carpick explained several areas that hold promise to allow MEMS parts to touch and rub indefinitely without ill effect; one method was to have a sealed system filled with alcohol. The research papers in the technical session extended the topics to include Materials and Surfaces, Fabrication & Materials and Magnetic Transducers.

The day three plenary speaker, Dr. Kurt Petersen, brought us his vast experience in successful MEMS entrepreneurship and shared his lessons on what makes a successful startup. He set a very optimistic tone for the future of MEMS, one that is bright but not a given, and offered many juicy tidbits for anyone who wants to successfully start, run and exit a business. There were too many to repeat here, but these stood out:

  • Have a great team that is persistent and dedicated to the company’s vision.
  • Get your product into production fast. This fit well with the advice from the Sunday Workshop session where “fail fast so you can learn and adapt fast” was a theme.
  • Know your market inside and out because the investors will.
  • Inventing is great, but designing for manufacturing and efficient production is probably more important. You cannot make money if you cannot make and test it economically.

The afternoon research papers covered the latest research in High Q Resonators and Resonant Systems, promising, as did all the papers, much more MEMS innovation to come in the future.

The Rump Session highlighted the Sigma Group, a collection of SciFi writers distinguished by their previous careers as scientists, engineers and program managers. They also use these skills to advise the government about future issues of concern and opportunity. They spent the week talking to conference participants to gather data, so in the session they offered many insights. After beers they shared even more interesting ideas and interactions with the audience!

Since the panel chair and the writers noted that SciFi correctly predicted most aspects of the internet and the sensors we use today, we can perhaps assume today's wild predictions are not as wild as we think. Perhaps the most interesting idea had to do with the brain systems discussed on day one: why not implant something in the nose that grows millions of micro-strands into the brain to act as probes? Overall, attendees and presenters at Hilton Head 2014 expressed much optimism that the MEMS industry will continue to grow into more unexpected parts of our lives as we move to a world of trillions of sensors and the internet of things.

The next conference is in Anchorage, Alaska, in June 2015. Hope to see you all there.

One-stop-fusion-shopping at freescale.com/sensorfusion

Guest post by Mike Stanley, Systems Engineer at Freescale

Back in February, I wrote an article describing the Xtrinsic sensor fusion library for Kinetis MCUs. Over the intervening months, we’ve made a number of improvements:

  • Demo and Development versions of the kit have been consolidated into a single installer that covers all options.
  • The limited “Evaluation” version has been removed. In its place, we offer free board-locked licenses tied to your Freedom development board. Licenses are generated automatically during the installation procedure.  You now have access to the full development version with your first download.
  • We've added support for two new base boards, bringing the total to four: FRDM-KL25Z, FRDM-KL26Z, FRDM-K20D50M and FRDM-K64F.
  • We’ve updated the Xtrinsic Sensor Fusion Toolbox for Android to support the new boards.  We also added several neat new features I’ll mention below.
  • We've published our Xtrinsic Sensor Fusion Toolbox for Windows. It's not a clone of the Android variant, although there are some common features. It goes well beyond that tool, offering a deeper understanding of some of the underlying calibration and fusion routines.
  • We’ve reworked the Android app landing page into a one-stop-shop for all things related to sensor fusion.  Visit http://www.freescale.com/sensorfusion to find convenient links for everything you’ll need to get your project started.  That includes all of the above, plus training materials, and a link to the Freescale Software Services group.  They can provide quotes for production software licenses and custom consulting work.

Figure 1 will look familiar to readers who have experimented with the Xtrinsic Sensor Fusion Toolbox for Android; the rotating PCB display shown here was inspired by that app. The Windows version gives you some really nice additions. First and foremost is support (via UART/USB wired connections) for the FRDM-FXS-9AXIS and FRDM-FXS-MULTI sensor boards. Unlike the FRDM-FXS-MULTI-B board, these do not have Bluetooth modules and cannot be used with the Android version of the toolbox. That's no problem for the Windows variant, which uses the virtual serial port feature of the OpenSDA interface to talk with the boards. Simply plug your board into your Windows machine, start the application and click the "Auto Detect" button in the upper right of the figure. The application will cycle through your PC's serial ports until it finds one connected to a Freedom board running the template app from the Xtrinsic Sensor Fusion Library for Kinetis MCUs. And if you have a Bluetooth-enabled PC, pair it with your FRDM-FXS-MULTI-B and run wirelessly; the communications interface is the same from the perspective of the Windows GUI.
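
If you want to script something similar yourself, the sketch below shows the general idea using the pyserial package. The probe logic is a placeholder: the real toolbox validates Freescale's own packet format, which is not reproduced here.

```python
# Hypothetical port scan in the spirit of the "Auto Detect" button (pip install pyserial).
import serial
import serial.tools.list_ports

def find_streaming_board(baud=115200, timeout=0.5):
    """Return the first serial port that is continuously streaming data."""
    for port in serial.tools.list_ports.comports():
        try:
            with serial.Serial(port.device, baud, timeout=timeout) as s:
                if s.read(64):            # placeholder check; real code would
                    return port.device    # validate the fusion packet header
        except (OSError, serial.SerialException):
            continue                      # port busy or not a board; keep scanning
    return None

print(find_streaming_board())
```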


Figure 1: Xtrinsic Sensor Fusion Toolbox for Windows – Device View

Just like the Android version, you can select from a variety of fusion algorithms. Also shown are the version of embedded firmware running on your Freedom board and the board type (assuming you have debug packets enabled on the board).



Figure 2: Xtrinsic Sensor Fusion Toolbox for Windows – Sensors View

Figure 2 shows the "Sensors" view of the application. Here you have current values and value-versus-time plots for raw accelerometer and gyro readings, plus calibrated magnetometer readings.


Figure 3: Xtrinsic Sensor Fusion Toolbox for Windows – Dynamics View

The "Dynamics" view, shown in Figure 3, lets you look at some of the virtual sensor outputs from the sensor fusion library. These include orientation in roll/pitch/compass-heading form, angular velocity and acceleration. You might wonder what the difference is between "angular velocity" and the gyro readings on the "Sensors" page. If your algorithm selection supports a physical gyro, then the values shown here have had gyro offsets subtracted from them. If your algorithm does not include gyro support, then the angular velocity is the result of a "virtual gyro" calculation (see "Building a virtual gyro").
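
A minimal sketch of those two paths, with illustrative names (the library's actual implementation is more sophisticated):

```python
# Physical-gyro path: subtract the estimated zero-rate offset (deg/s per axis).
def corrected_rate(gyro_raw, gyro_offset):
    return [g - o for g, o in zip(gyro_raw, gyro_offset)]

# Virtual-gyro path: finite-difference successive fused orientation estimates.
def virtual_gyro(angle_prev_deg, angle_now_deg, dt_s):
    return (angle_now_deg - angle_prev_deg) / dt_s

print(corrected_rate([1.2, -0.4, 0.10], [0.9, -0.5, 0.05]))  # ~[0.3, 0.1, 0.05]
print(virtual_gyro(10.0, 10.5, 0.01))                        # 50.0 deg/s
```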

The accelerometer reading on the "Sensors" page includes the effects of both gravity and linear acceleration. The "Acceleration" item on the "Dynamics" page has had the effects of gravity removed, so it represents only the linear acceleration of your board.
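
A rough illustration of that gravity-removal step is below. This is not the library's code, and the sign conventions are an assumption that varies with board axis definitions:

```python
# Subtract gravity, expressed in the sensor frame, from a raw accel reading (m/s^2).
import math

G = 9.81  # m/s^2

def linear_acceleration(accel, roll_rad, pitch_rad):
    gravity = (-G * math.sin(pitch_rad),
               G * math.sin(roll_rad) * math.cos(pitch_rad),
               G * math.cos(roll_rad) * math.cos(pitch_rad))
    return tuple(a - g for a, g in zip(accel, gravity))

# A board lying flat and at rest should report roughly zero linear acceleration:
print(linear_acceleration((0.0, 0.0, 9.81), roll_rad=0.0, pitch_rad=0.0))
```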



Figure 4: Xtrinsic Sensor Fusion Toolbox for Windows – Magnetics View

I think Figure 4 shows the neatest feature introduced in the toolbox. Those of you who have seen prior generations of Freescale magnetometer demos will recognize the computed hard and soft iron correction coefficients on the left, along with our "magnetic calibration meter." What's new is the 3D-to-2D projection shown on the right. These are the measured data points selected by the magnetic calibration library for use in determining the correction coefficients. Ideally, the figure should be circular in shape, centered at (0, 0), with a radius equal to the magnitude of the earth's magnetic field. Nearby magnets, fixed spatially relative to the sensor, will shift the center to some non-zero value. Ferrous materials, fixed spatially relative to the sensor, will distort the circle into an ellipse, and possibly rotate it. If sources of interference are not fixed relative to the sensor, you'll still see distortion, but it will not behave in as predictable a fashion and isn't as easily corrected. It's educational to bring your board near sources of magnetic interference and watch how the constellation distorts, then self-repairs over time.
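
To make the hard-iron part concrete, here is an illustrative sketch (not Freescale's calibration library): with a pure hard-iron offset, the measurement constellation is a shifted sphere, so averaging the per-axis extremes recovers the offset. Soft-iron (ellipsoid) correction requires a full least-squares ellipsoid fit and is omitted here.

```python
# Estimate a hard-iron offset from magnetometer samples spanning many orientations.
def hard_iron_offset(samples):
    offset = []
    for axis in range(3):
        values = [s[axis] for s in samples]
        offset.append((max(values) + min(values)) / 2.0)  # center of the shifted sphere
    return tuple(offset)

def apply_correction(sample, offset):
    return tuple(m - o for m, o in zip(sample, offset))

# Synthetic data: a 40 uT field with a (10, -5, 3) uT hard-iron offset.
samples = [(50, -5, 3), (-30, -5, 3), (10, 35, 3),
           (10, -45, 3), (10, -5, 43), (10, -5, -37)]
print(hard_iron_offset(samples))  # -> (10.0, -5.0, 3.0)
```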


Figure 5: Xtrinsic Sensor Fusion Toolbox for Android – Device View

Figures 5 and 6 are screen dumps from the latest version of the Xtrinsic Sensor Fusion Toolbox for Android.  If you enable display of debug packet information in the preferences screen, you’ll get additional information displayed on the device view:

  • The version of software running on your development board (Version 417 in this case)
  • The number of ARM CPU "systicks" occurring during one iteration of the main sensor fusion loop. Take this number and divide by the CPU clock rate, and you have the number of seconds required for each iteration through the loop; for the case above, 514,860 / 48 MHz = 10.7 ms (spelled out in the sketch after this list). The number is computed in real time and changes depending on which algorithm you are running.
  • The board type you are using (a lot of the boards look alike)
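
That loop-time arithmetic, written out (values from the screenshot; your board and algorithm will give different numbers):

```python
# Convert a systick count per fusion loop into a loop time.
SYSTICKS_PER_LOOP = 514_860   # reported in the Device view above
CPU_CLOCK_HZ = 48_000_000     # 48 MHz Kinetis core clock

print(f"{SYSTICKS_PER_LOOP / CPU_CLOCK_HZ * 1e3:.1f} ms per iteration")  # 10.7 ms
```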

I should mention that all of the above are also shown in the “Device” tab in the Windows-based toolbox.


Figure 6: Xtrinsic Sensor Fusion Toolbox for Android – Canvas View


Figure 6 shows the new "Canvas View," which was just added to the Android version of the Toolbox. It demonstrates how we could use the sensor fusion quaternion output to create a wireless pointer. The accel/gyro and 9-axis algorithms work best; the 3-axis options are pretty much worthless due to basic limitations of using just those sensors, although I will note that gyro-based air mice are possible, just not with this particular algorithm. Check/uncheck the "Absolute" checkbox on the Fusion Settings Bar to switch between the "absolute" and "relative" versions of the wireless pointer algorithm. And be sure to read the "Canvas" chapter of the in-app documentation to get full details about how it works.
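
As a rough illustration of the idea (the app's actual mapping is documented in its "Canvas" chapter, and everything below is an assumption), a pointer can be driven by rotating the board's forward axis through the fusion quaternion and scaling the result to screen offsets:

```python
# Rotate a vector by a unit quaternion q = (w, x, y, z), then map to 2D.
import math

def rotate(q, v):
    w, x, y, z = q
    vx, vy, vz = v
    tx, ty, tz = 2*(y*vz - z*vy), 2*(z*vx - x*vz), 2*(x*vy - y*vx)  # 2*(q_vec x v)
    return (vx + w*tx + (y*tz - z*ty),   # v' = v + w*t + q_vec x t
            vy + w*ty + (z*tx - x*tz),
            vz + w*tz + (x*ty - y*tx))

def pointer_xy(q, gain=500.0):
    fx, fy, fz = rotate(q, (1.0, 0.0, 0.0))  # board's forward axis
    return (gain * fy, -gain * fz)           # yaw drives x, pitch drives y

half = math.radians(10) / 2                  # a 10-degree yaw about the z axis
q = (math.cos(half), 0.0, 0.0, math.sin(half))
print(pointer_xy(q))                         # ~(87, 0): pointer moves horizontally
```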

Our goal with the new http://www.freescale.com/sensorfusion page is to give you everything you need to get started quickly.  Relevant hardware, libraries, tools, training materials and support options have been brought together in one place.  If you already have the CodeWarrior for Kinetis MCUs IDE installed on your Windows machine, and have your development boards on hand, you can be up and running ten minutes from the time you land on the page.  And as always, if you have suggestions or ideas for how to improve things, just drop me a line.

Report from Hilton Head 2014 Solid State Sensors, Actuators and Microsystems Conference


By Eric Levy-Myers on behalf of MEMS Industry Group

Greetings from the Hilton Head 2014 Solid State Sensors, Actuators and Microsystems Conference. On June 8, I attended the optional Sunday Workshop, Frontiers of Characterization and Metrology for Micro- and Nano-Systems, organized by Michael Gaitan of the National Institute of Standards and Technology (NIST) and sponsored by MEMS Industry Group (MIG). Gaitan gathered a great group of speakers to address this topic.

NIST organized this session at Hilton Head for the second time. The program is designed to advance the standardization of MEMS testing that started at MIG's Member-to-Member (M2M) Forum in 2010 and led to the NIST/MIG report "MEMS Testing Standards: A Path to Continued Innovation." It was apparent from the interactions that the industry has realized that cooperation in the precompetitive space of testing and characterization must increase to allow the industry to grow and innovate.

At the Micro-Nano workshop there were eight fascinating talks followed by a very lively panel discussion about the challenges and issues in MEMS characterization and testing. A few of the juicy conclusions are below. For all the details, be sure to subscribe to MEMSBlog so you can access the full report, which should be ready later this summer (free and available for anyone to download).

Key issues discussed include: 

  • Manufacturers and users always find ways to use, and sometimes damage, MEMS devices in ways no designer or tester could predict. An example was a "blow out the birthday cake candles" smartphone application that had users blow into the microphone. It took a while to link the app to damaged microphones (whoops).
  • Testers are not the bad guys, but they can deliver results people do not like to hear. The faster people listen, the faster the devices can be fixed. In fact, "fail fast" can be a good approach to getting the best product out the door the fastest.
  • Since testing of MEMS devices leads to discovering novel failure modes, testing, failure analysis, manufacturing and design teams should be in close and continuous contact, especially in high volume systems.
  • The fact that customers always want devices with more features that are faster, smaller and cheaper leads to huge pressure on testing, which never seems to have enough time to get ready for production.
  • New device types often require custom testing equipment and procedures, but over time, as these devices become more common, testing can be standardized.
  • It is easy to rely too much on tools instead of engineering intuition. There is no substitute for real world experience.

Calling all Innovators to Help Save Our Oceans


Guest Blog by Matt Huelsenbeck, Team Relations Manager, XPRIZE

A major problem facing the ocean is that air pollution is also ocean pollution. The surface ocean layer has become on average 30% more acidic since the Industrial Revolution due to the absorption of carbon dioxide from the atmosphere. These changes in ocean chemistry, dubbed ocean acidification, threaten many forms of marine life, fisheries, and other vital ocean services. But due to a lack of good tools to measure pH, there is little to no information on how ocean pH is changing on a regional level, or in places like the deep sea. We can’t tackle a problem we know so little about.
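
As an aside on the chemistry, the "30% more acidic" figure follows directly from the logarithmic pH scale. The sketch below uses the commonly cited surface-ocean drop of about 0.1 pH units since pre-industrial times; the exact percentage depends on the pH change assumed:

```python
# pH is -log10([H+]), so a small pH drop is a large rise in hydrogen-ion concentration.
ph_before, ph_after = 8.2, 8.1   # approximate pre-industrial vs. present surface ocean
increase = 10 ** (ph_before - ph_after) - 1
print(f"{increase:.0%} more hydrogen ions")   # ~26%, i.e. roughly 30% more acidic
```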

Therefore, the XPRIZE Foundation, the leading non-profit that solves the world's Grand Challenges through large-scale incentivized prize competitions, is collaborating with ocean philanthropist Wendy Schmidt to offer $2 million in prizes to address ocean acidification through the development of breakthrough pH sensor technology. The winning pH sensor(s) of the Wendy Schmidt Ocean Health XPRIZE will be radically more accurate, durable, and affordable. This is where you come in.

We are looking for teams of innovators to compete in this once-in-a-lifetime competition to help tackle the issue of ocean acidification! Would you, or someone you know, be interested in forming or joining a team? Skills as diverse as electrical engineering, materials science, data science, nanotechnology and chemistry could be part of the winning team. Registration is open, but closing soon, and we encourage you to fill out the Intent to Compete form today. By submitting your Intent to Compete form, you can build or join a team made up of innovators like yourself.

There are two prize purses available (teams may compete for, and win, both purses):

$1,000,000 Accuracy award – Performance focused (First Place: $750,000, Second Place: $250,000): To the teams that navigate the entire competition to produce the most accurate, stable and precise pH sensors under a variety of tests.

$1,000,000 Affordability award – Cost and Use focused (First Place: $750,000, Second Place: $250,000): To the teams that produce the least expensive, easy-to-use, accurate, stable, and precise pH sensors under a variety of tests. 
 
This is your chance to apply your skills to help improve our understanding of one of the ocean's greatest threats, ocean acidification, and win up to $2 million in the process! We hope to see you compete.

MIG Conference Japan Wrap-Up

By Karen Lightman, Executive Director

I am finally over the jet lag and able to share my thoughts from MEMS Industry Group (MIG) Conference Japan, MIG's inaugural conference in Asia, held on April 24. But first let me quickly express my happiness at having returned to Japan after a three-plus-year hiatus. (My last visit was before the tsunami/earthquake.) I ate sushi every day, drank sake, partook of a Japanese bath and consumed green tea (in very large quantities). What a great place to visit.

A few months ago I invited you to spend a week with me in Japan, as there were several partner events that dovetailed with our MIG conference, including the NanoMicro Biz ROBOTECH and MEMS Engineer Forum. On April 23 I traveled to Yokohama to give a keynote at NanoMicro Biz’s 20th annual International Micromachine/Nanotech Symposium.

The conference had been relocated to Yokohama, an impressive "city by the bay" that is only a 30-minute train ride from Tokyo. And while the exhibition site was smaller than in previous years, the Symposium was still impressive, and my presentation on "MEMS and Sensor Trends, Paving the Way for the Internet of Things" was well received by a diverse and international audience. I also had the opportunity to represent MIG in our booth and sneak in a few MIG-branded chocolates created for us by the conference organizers (yum), as well as connect with several MIG members and partners in attendance.

Then it was back to Tokyo to kick off MIG Conference Japan with MIG Events and Program Manager Chivonne Hyppolite. Simply put, the conference exceeded expectations in terms of the quality and number of attendees as well as the content. I am grateful for the guidance and support MIG received from Mr. Susumu Kaminaga of SKG Partners and Mr. Yoshio Sekiguchi of OMRON; without them, there is no way the conference would have happened, let alone been successful.

What excited me the most about MIG Conference Japan was the originality of the content provided by our keynotes and featured speakers. (Here is the agenda.) The focus of the conference was on navigating the challenges of the global MEMS supply chain. Several of the speakers gave their no-holds-barred view of these challenges, including keynoter Takeshi Ito, Chief Technology Officer and Head of Technology at Sony Mobile Communications. Mr. Ito shared his thoughts on the future of MEMS and sensors (and in particular, alternative uses for acoustic MEMS), which I found very interesting, and I truly appreciated his end-user/OEM perspective. I also thoroughly enjoyed the presentation by Leopold Beer, regional president Asia Pacific, BOSCH Sensortec, who explored the criticality of balancing higher integration and rapid product cycles with the need to support multiple applications.

Honestly, all the presentations at MIG Conference Japan were impressive, and I am not going to do a play-by-play here for you. (Sorry folks.) But what I will do is urge you to consider attending our next big event in Asia: MIG Conference Shanghai, which will be held September 11-12, 2014 in Shanghai in partnership with the Shanghai Institute of Microsystem and Information Technology (SIMIT) and the Shanghai Industrial µTechnology Research Institute (SITRI).

Our Shanghai event will be more focused on the theme of the Internet of Things/Services/Everything as well as the challenges of a global MEMS supply chain. Please join me there to further explore the future of MEMS and sensors. For more information, you can visit our website.  

Bulbs Need Intelligent Lighting Systems

Guest blog from Semico Research

There are over 3,000 companies making LED bulbs. Regionally, there are countries like China with five-year plans that foster the development of leading SSL (solid-state lighting) manufacturing firms while pushing LED lighting onto the market. How many light bulbs do you have in your house? How many are LED? How many lights at your workplace? On the streets and freeways?

If you thought the sensor market was large before, with smartphones and fitness trackers, imagine all the sensors and controls that could go into lighting sources and outlets, with the intent of monitoring behavior and finding trends in order to predict how and where our lighting should be installed.

With MEMS, the entire smart home may have sensors. For example, your walls may have accelerometers built in to help predict and recover from earthquakes. Bulbs may make use of a MEMS microphone to help determine lighting needs. As the price of MEMS sensors continues to decline, manufacturers should turn their eyes to this market.

For example, imagine having the majority of your ceiling be comprised of multiple types of lights, all of which can automatically be adjusted depending on your behavior.  This is important for the home theater system, where in order to play a movie, the screen must be lowered, the system turned on, the curtains closed, etc.  But, with smart lighting controls, the mere act of sitting down on the couch at a particular time of the day could trigger all those other actions automatically with the lights adjusted accordingly.  How can MEMS contribute?

Perhaps even more useful, intelligent lighting can sense commands from other lighting sources without the use of a wired connection. This effectively creates a 3D map of your environment with the lighting system at the head of it. No more automatic lights that rely on gestures to stay on: the lighting system of the future will know whether there are living creatures in the room. This isn't far out in the future, either; we're looking at this technology now, and we are at the point where manufacturing and deployment must work together.

According to the U.S. Department of Energy, 86% of all lighting in residential markets currently has no control system, and 70% of all commercial lighting has no controls. The market penetration rate is so small, and the potential so large, that this is a market you should be keeping an eye on. That is why Semico is hosting a Smart Lighting Event on April 23rd in Santa Clara to discuss deployment trends and the opportunities and barriers to entry we have to look forward to. Semico's CTO, Tony Massimini, will discuss, in particular, how MEMS manufacturers can build a niche within the Smart Lighting market. Join us and register here.