EDA

OpenText offers secure collaboration environment

At the 52nd Design Automation Conference in San Francisco, we talked to Rod Simon of OpenText about their collaboration platform, Exceed VA TurboX, which was introduced to the EDA industry at the conference. With a web-based interface, Exceed VA TurboX is a hybrid solution intended to improve users' productivity by enabling secure collaboration from any location. Exceed VA TurboX is designed for the enterprise data center, so administrators can easily manage and monitor access to sensitive applications and data. Here's the interview:

Yotta Data Sciences may have the answer to lowering chip design cost

Yesterday we ran a report about whether chip design costs could be reduced to the point of profitability for innovative designs. Conventional wisdom said that possibility would not come around for at least 10 years. But in our year-long investigation we stumbled across a very quiet company, Yotta Data Sciences, whose founder and CEO, Tom Grebinski, might have a solution that could speed the process up within a couple of years. Grebinski has put his thumbprint on the semiconductor industry for a couple of decades. He dealt with how ICs are physically composed by pioneering atomic layer deposition technology in the 1980s, then moved on to handling yottabytes of data as the author of SEMI's OASIS integrated circuit layout format standard. Now he's taking on how that data can be managed, distributed and protected efficiently and effectively.

 See the full story on ChipDesignMag.com

Quest for the $10K Chip: Final Report

Part 1: Is this trip necessary?
By Lou Covey, New Tech Press
First appeared in the print version of Chip Design Magazine (June 2013)

At the Design Automation Conference in June 2012, Verilab's JL Gray posed a perfectly unreasonable question: "Can you build a chip to verified prototype for less than $10,000 that an investor would want to back as a product?" Over the past few months a variety of experts on the subject came to a consensus and, as it turns out, the answer is yes and no. Yes, you can build one to prototype, but no, an investor would not find the product interesting. How about building a chip interesting to an investor for under $25 million, which Gary Smith, chief analyst for GarySmithEDA, calls the tipping point for investors? Again, the consensus: yes, but not for at least 10 years... maybe 20.

DAC had a significant contingent of companies pushing FPGA prototyping tools and several pundits claiming that using FPGAs immediately reduces the cost of development. In an interview a few months later, Smith concurred that people are making $10K chips with FPGAs and free tools "all the time," but noted those tools do the most good for chips at 90nm and above, and nothing to reduce the cost of bringing a chip through manufacturing to market, which is the bulk of the cost, especially at the more advanced nodes. (link to Smith videos) That's the rub, but progress is being made.

Reducing the overall cost of chips, from development to manufacturing, has been the goal of everyone from the smallest EDA company to the largest foundry. But every advance in process node means investment in even more expensive and potentially unproven technology. And almost every EDA product release promises, theoretically, to reduce costs by $10 million, depending on the application.

As early as 2009 the total overall cost of bringing an advanced chip to market exceeded $50 million. Smith states that the cost has since been reduced to as little as $28 million. He identified much of that cost reduction as coming from three directions: effective and low-cost tool packages, the advancement of ESL tools raising the level of abstraction, and the return of independent design services reducing NRE costs.

Ten years ago it cost an average of $10 million to acquire enough EDA tools to do a decent job on a new chip. Smith said you can now achieve success with packages below $20,000 for chips at 90nm and up to 2 million gates. Below that node and above that gate count, it gets dicey.

Raising the level of abstraction to ESL has been touted for years as the key to reducing overall cost, and the latest generation of tools, according to Smith, is just coming onto the market. According to Karl Kaiser, vice president of engineering at Esencia, "ESL is the key to reducing engineering costs, which comprise the major financial barrier in the development of any truly innovative chip design."

Kaiser pointed out that small companies like Adapteva are using innovative tools as well as innovative ways of raising funds to develop new products (Maxfield article) to keep costs low. Adapteva's first chip to market cost less than $2 million. But those cases are the exception rather than the rule.

Design services are rising to the forefront again, to the point that Smith calls the niche the "wild west" of the EDA industry, with new companies emerging not only in Asia but in the US as well.

Josh Lee, CEO of Uniquify, a design services company, echoed Smith, saying a design services company can not only simplify but standardize the design process, objectively evaluate what the right tool will be, and compensate faster for specific tool weaknesses. This eliminates a significant amount of NRE cost, especially for an OEM that doesn't specialize in chip design. Several companies are divesting themselves of chip design departments and outsourcing the work to design firms. Earlier this year TSMC added Uniquify as a design partner and LG Electronics selected the company for memory IP design.

A common thread through all the comments and predictions on the issue of lowering the cost of semiconductors, however, is NRE. That boils down to the time it takes for an engineer or a team of engineers to coordinate and produce a design. Tools can only go so far. Outsourcing can only go so far. Methodology, IP and design platforms can only go so far. None of them has come close to dealing with one major problem: the amount of data engineers have to deal with.

There has been some discussion about using the cloud to let EDA customers use tools "as needed" rather than buying seats, and to coordinate disparate design groups worldwide. The skepticism is almost unanimous. "The cloud is an important aspect of IT," said Smith, "but it can't handle the high-end computing problems." Lee concurred: "The cloud can be useful in EDA, but alone it will not help solve the problems of design."

Or not... Stay tuned.

(Part 2, "Enter Yotta Data Sciences," will appear later this week. This article is unsubsidized and was produced in partnership with Extension Media.)


IC design is focused on wrong problems

By Lou Covey, Editorial Director

The semiconductor design industry is focused on the wrong problems, according to Brad Brech, distinguished engineer at IBM, speaking at the ISQED symposium in Santa Clara yesterday. Though he stated his position in the most diplomatic terms, that was the upshot of his talk on "Sustaining Innovation for Smarter Computing in Data Centers."

While chip design is focused on increasing speed and computing power, Brech said efficiency and cost control are the biggest concerns of the end customer now.

Brech said the chip industry is still focused on incremental increases in performance, but the improvements we see in semiconductors, 200ms to 300ms, are imperceptible to the end user.

“We need to bring a different kind of value to the customer,” he said. “In the 1970s and 80s the airline industry was trying to move people faster between destinations and almost went bankrupt doing it,” he stated.  “There hasn’t been a faster plane put into service in over 20 years, but they have found ways to move more people. We need to do the same thing in IT regarding data.”

Brech stated that 70 percent of the IT budget is devoted to operations and maintenance of data centers, while the complexity of managing massive amounts of data grows steadily by orders of magnitude. He said 22 billion devices are connected to the internet now, downloading and adding data every second. Applications such as cognitive computing and big data analytics are compounding the problem.

The solution, he stated, is in alternative designs that are cloud ready, data ready and security ready.  Those are the applications technology needs to focus on.

Quest for the $10K chip: Services are a mature niche

Continuing our series on reducing the cost of semiconductor design and manufacturing, we interviewed Josh Lee, CEO of Uniquify, on the design services niche. Lee took mild exception to Gary Smith's description of the niche as becoming a "wild west" of competition, saying the combination of IP integration with services has created a mature industry that is competitive but also crucial to creating new products in the absence of venture capital involvement.

Quest for the $10K Chip: The wild west of design services

In the fourth installment of our Quest for the $10K Chip series, we return to the last part of the interview with EDA analyst Gary Smith to discuss the reemergence of design services and their role in reducing the cost of chip design. Smith believes the industry is headed toward a new generation of ASIC houses, and says the advantage of ESL is that it allows designs to be handed off at any number of sign-off points. Smith believes the design services segment is in for big growth in the coming years.

Esencia weighs in on the $10K Chip

ESL startup Esencia has been making noise about the changing of the guard in IC design as the industry moves from RTL to ESL. In this third part of our series on the Quest for the $10K Chip, Karl Kaiser, VP of engineering for Esencia, talks about Gary Smith's view of the cost of IC design and how ESL can continue to lower that cost.

Gary Smith, ESL and the Quest for the $10K Chip

In part two of our Quest for the $10K Chip series, EDA analyst Gary Smith discusses the need for ESL, how it will reduce the cost of designs, and why the time is right for ESL to be implemented now. Smith says that in the past we leveraged IP reuse as a way to solve our design challenges, but we are now seeing 100-block designs with 100-million gate counts; IP reuse alone will not solve these challenges. In this video Smith discusses what he sees coming next.

See Part 1 here.


Gary Smith considers the quest for the $10K chip

At the Design Automation Conference in June 2012, tech blogger JL Gray posed a question: Can you build a chip to prototype, for $10,000, that would interest an investor in taking it all the way to production? The question launched a six-month investigation by New Tech Press to get an answer. Dozens of entrepreneurs, analysts, investors and engineers accepted invitations to discuss the subject. While most wished to remain anonymous, a few agreed to be captured on video. We will be rolling out the series over the next few weeks with links to several articles in other publications.

The video platform we are using is from Me!Box Media and allows viewers to interact with and share the content on a broad level. We encourage your input in the comments on this site; however, if you wish to remain anonymous, you can send private comments directly to us by clicking on the email button on each video. Hover your cursor over the screen to expose the engagement buttons.

The series we start today is the best of those meetings, anchored by a three-part interview with Gary Smith, chief analyst for GarySmithEDA. Gary starts with the basics of the question, from the use of low-cost FPGAs and free tools through the actual costs of manufacturing and ways to keep tool costs low.

Gary Smith on the Cost of IC design

France's Menta introduces the "chipless" FPGA

In our continuing search for the "$10,000 chip" we ran across yet another French company that brings us closer to our goal. Menta S.A.S., based in Montpellier in the south of France, has created what might be called a "chipless" FPGA: a programmable IP core that allows designers to target any design platform, be it a traditional FPGA or an SoC. You don't have to buy a chip to finish the design; implementation can be decided at the end of the process.

The soft IP core permits a standard RTL flow with design tools.  Designers can modify the design according to their needs on the fly.  A hard macro version of the soft IP can be a drop-in solution for rapid time to market.

But for our purposes, the elimination of the hardware early in the process means you can get to prototype faster and with less cost.  New Tech Press dropped in on the Menta booth at DAC 2012 and chatted with the CEO, Laurent Rouget.

(New Tech Press is debuting a new video tool with this report from our new partner Me!Box Media. The technology allows you to interact directly with the video, including sending comments back to us. Let us know what you think.)

Esencia Technologies tool boosts programmable logic design

This year, as in almost every year, Gary Smith of Gary Smith EDA declared that electronic system level (ESL) design is now here. Reality has not always agreed, but there are signs that Gary might be on to something. One of them was a tiny startup on the DAC floor called Esencia Technologies, with an interesting tool kit called Escala. Among its claims: Escala can make it possible to replace ASICs with low-end FPGAs in low-volume production systems. Lou Covey from Footwasher Media checked them out in this video interview.


Univa filling Sun gap in EDA industry

Sun Microsystems had been an active partner and participant in the EDA world until Oracle completed its acquisition in 2010. Suddenly the Sun logo disappeared, quite conspicuously, from the EDA exhibitions of 2010 and 2011, where Sun had previously been prominent and ubiquitous. A little of that came back in 2012, with Univa taking its first active presence at the Design Automation Conference. In 2007 Univa helped Sun create the Sun Grid Engine, a job resource and management tool, developing the software to meet Sun's HPC go-to-market requirements and becoming a reseller of the engine. In 2010, however, Oracle decided the grid engine division was not as profitable as it would like and dropped it altogether, giving Univa an instant gift of thousands of dedicated users and a rapidly growing services business. Nearly 30% of the company's customers are in the EDA/semiconductor space.

In January 2011, Univa hired the Sun Grid Engine team and redesigned its business model to continue supporting and updating the Grid Engine. Over the following year Univa delivered more code to its new community than any third party, patching holes and adding new functionality, something that had not been done for close to two years.

In March of this year, Univa released the results of a Technical Computing User Survey showing that 70 percent of respondents expected an increase in their use of high performance computing this year and 75 percent expected an increase in 2013. No one indicated a decline in use.

The survey's findings were instrumental in Univa's decision to make its presence in the EDA industry obvious, and the plan, according to president and CEO Gary Tyreman, is to grow that presence steadily. New Tech Press's Lou Covey sat down with Tyreman to talk about Univa and where the EDA industry is headed.


ProFPGA unveils FPGA prototype boards

ProFPGA came out of stealth at the Design Automation Conference (DAC 2012) in San Francisco with a technology-agnostic hardware system for developing FPGA prototypes. ProFPGA is a spinoff from ProDesign Europe (http://www.prodesigncad.de), located in Germany. The system can be used with Xilinx or Altera FPGAs and with any commercially available development tool, including those from Synopsys, Xilinx or Altera.

Flexras debuts Wasga Compiler

Flexras Technologies hit the 2012 Design Automation Conference (DAC) with the Wasga Compiler, an FPGA prototype partitioning tool, with some big claims. Turns out the claims were true... sort of. Their partner, Xilinx, confirmed the tool did provide good results quickly. So the hyperbole in their press release can be forgiven. New Tech Press editorial director Lou Covey interviewed the CEO, Hayder Mrabet.

Rising custom IC costs could eat into Apple's nest egg

By Lou Covey, Editorial Director

It's time to stop wondering what Apple is going to do with its cash reserve after it pays out dividends to stockholders. If what Cadence's Tom Beckley says about the next generation of chips holds true, Apple is going to need every dime to create the next generation of processors for the iPad and iPhone.

Beckley, senior vice president of R&D in the Cadence Custom IC group, was the keynote speaker at the 2012 International Symposium on Quality Electronic Design (ISQED) in Santa Clara, addressing "Taming the Challenges in Advanced Node Design." Beckley pointed out that Apple has been the poster child for cost-efficient development and production, but even if every chip developer followed the "Apple Way" it would not put much of a dent in the total cost of developing the next generation of SoCs.

The A5 system on chip in the current Apple products, designed at 45nm, could come in under $1 billion to design and bring to market with effective control of the supply line. Cost projections for a chip at 28nm (the next step) run as high as $3 billion. At 20nm, the cost could exceed $12 billion (if you build your own fab, which Apple could well afford). The Cadence exec stated that the cost of EDA tools (both purchased and developed) alone could run as high as $1.2 billion.

The evidence of the increasing costs of development can be seen in the profit margins of the iPad.  According to iSuppli, the cost of the A5 chip in the new iPads at $23 is double the cost of the original A4 chip.  Why is the cost going so high?  Because the way chips are being manufactured is changing dramatically.

Beckley explained that the physics of making a semiconductor mask reached a breaking point at the current most popular nodes as the resolution of a photoresist pattern begins to blur around 45nm.  Double patterning was created to address that problem at 32nm.  "But everyone wanted to avoid doing it at 32nm because of the mask costs.  They wanted to maximize their investment in lithography equipment."

The process splits the design, where structures are too close together, into two separate masks. It's an expensive process (especially when each mask costs around $5 million) and requires entirely new ways of creating the masks to avoid rule violations. But where the foundries were willing to let it slide at 32nm, they are requiring double patterning for everything below, Beckley stated.
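The splitting step Beckley describes is, at its core, a graph two-coloring problem: any two features too close to print on one mask share an edge in a "conflict graph," and each of the two colors becomes a mask. Here is a minimal sketch of that idea in Python; the point-based spacing check and the feature names are illustrative simplifications of our own, not how a production decomposition tool actually works (real tools operate on polygons, stitch insertion and full design rules):

    from collections import deque
    from itertools import combinations

    def assign_masks(features, min_spacing):
        """features: {name: (x, y)}. Returns {name: 0 or 1}, or None if no
        legal two-mask split exists (an odd cycle in the conflict graph)."""
        # Build the conflict graph: an edge joins any two features that sit
        # closer together than the single-mask resolution limit.
        conflicts = {name: set() for name in features}
        for (a, pa), (b, pb) in combinations(features.items(), 2):
            if (pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2 < min_spacing ** 2:
                conflicts[a].add(b)
                conflicts[b].add(a)
        # Two-color the graph with BFS; neighbors alternate between mask 0
        # and mask 1. A same-color neighbor means the layout must change.
        masks = {}
        for start in features:
            if start in masks:
                continue
            masks[start] = 0
            queue = deque([start])
            while queue:
                node = queue.popleft()
                for neighbor in conflicts[node]:
                    if neighbor not in masks:
                        masks[neighbor] = 1 - masks[node]
                        queue.append(neighbor)
                    elif masks[neighbor] == masks[node]:
                        return None  # conflict: two masks are not enough
        return masks

    # Three features in a row, each too close to its neighbor:
    print(assign_masks({"m1": (0, 0), "m2": (1, 0), "m3": (2, 0)}, min_spacing=1.5))
    # {'m1': 0, 'm2': 1, 'm3': 0} -- alternating features land on separate masks

The "rule violations" Beckley mentions show up here as the None case: when the conflict graph contains an odd cycle, no assignment to two masks works and the layout itself has to be redesigned.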

These new techniques are driving development costs up all along the design chain. Beckley said he has close to 400 engineers in his unit working on tools just for 20nm design -- half of his entire staff.

The benefits of moving to the new node are just as tremendous, he said. Instead of millions of transistors, each chip will have billions, allowing for greater functionality in devices. "We expect improvements of 25-30 percent in power consumption and up to 18 percent overall performance improvement," he predicted.

"If what I'm saying scares you, it should.  There are many questions and issues to be ironed out," Beckley concluded.  "But at Cadence we are already working with a dozen customers on active test chips, which will increase to 20 very soon, and we are already working with customers for products at 10nm."

What are you doing to overcome the rising cost of custom ICs?  Join the discussion at element14.com


Magma acquisition will light a fire under the EDA industry

A Footwasher Media Analysis
By Lou Covey, Editorial Director

The acquisition of Magma Design Automation by Synopsys was arguably the biggest story of the Electronic Design Automation (EDA) industry in 2011.  It will also most likely end up being the biggest EDA story of 2012 as well.

Most observers were stunned at the news, and not because it was an unlikely fit. It's actually a great fit and gives Synopsys a virtual stranglehold on the digital IC design market. It was improbable because of the deep-set enmity between the two companies, especially between the two CEOs, Aart DeGeus and Rajeev Madhavan. The companies launched multiple lawsuits against each other over the past decade claiming patent infringement on a variety of technologies, all of which had been resolved prior to the acquisition but left scars throughout both companies. As one source who spent time as an employee of both companies said, "It was personal for some people and just business for others, but it was pervasive."

The acrimony between DeGeus and Madhavan often manifested publicly.  DeGeus would often be absent from CEO panels where Madhavan was present, and Magma employees and supporters made sure the industry noted that Madhavan was often not invited when the leadership of Cadence, Mentor Graphics and Synopsys were represented.

Madhavan also perennially accused Synopsys of deliberately undercutting prices in large package deals, a charge DeGeus vehemently, and perennially, denied. In Synopsys' defense, however, it was an unofficial industry-wide practice, according to Jeff Jussel, senior director of global technology marketing at element14. Jussel is a former ASIC designer and EDA executive (including marketing director at Magma) and is leading element14's push into embedded and electronic design services.

“There was so much competition between the big four that it drove a never-ending spiral of ever cheaper prices. Better results, better productivity, but less and less money for the developers of that technology. The big guys could deal with it better because they had the big 'all you can eat' deals: $1 million for one year, $3 million for two years, $4 million for five years. The deals looked bigger, but the terms were getting longer and the price per seat was coming down. It started killing the innovation in start-ups because there was so much pressure on margins that there was nothing left to buy from the start-ups, and the start-ups were getting pushed to the side by the deals," said Jussel. "The practice squeezed innovative start-ups out of the market because they couldn't compete and be profitable.”

Synopsys bought many failed start-ups for the cost of their assets alone, eliminating competition and gaining valuable technology with little investment. As this practice continued industry-wide, investors saw little upside in funding new start-ups and, at present, there is virtually no interest in funding new companies that have no chance of an IPO or of being acquired at a premium over investment. This is where the acquisition of Magma may have the greatest potential for energizing the moribund industry.

First, it consolidates the industry nicely.  Synopsys holds digital, Cadence leads in analog and mixed signal, and Mentor dominates embedded design.  The lawsuits and undercutting that decimated the start-ups will be a thing of the past.  Customers will have to pay what the vendors ask or be forced to build their own solutions...or look to start-ups. Which brings us to the second reason.

At $507 million, it is the single largest acquisition in the industry's history, eclipsing Cadence's acquisition of Cooper & Chyan Technologies for $428 million in 1997. Combined with the Ansys purchase of Apache Design for more than $310 million and a handful of other smaller deals this year, it helps release nearly a billion dollars of cash into the pockets of investors and founders. All of these deals will be concluded before the Design Automation Conference in San Francisco this July. Conservatively, the industry could see $100 million of that invested in new technology before the end of 2012.

And who will be leading that charge?  None other than Rajeev Madhavan.

Madhavan could be called the single most successful entrepreneur in the EDA industry. He was a founder of LogicVision, a company that was sucked up by Mentor Graphics for $13 million in Mentor stock. He founded Ambit to attack the Synopsys logic synthesis hegemony, using guerrilla marketing techniques to grab market share and, in the end, selling to Cadence for a quarter billion dollars. Madhavan reinvested much of his take into founding Magma, which went from start-up to IPO in short order. Combined with the Ambit valuation, Madhavan-founded companies account for over $700 million in corporate value. No other single entrepreneur has that kind of record on their resume.

The sale agreement precludes Magma and Synopsys representatives from speculating on who stays and who goes. There are those who hope Madhavan stays put for some time, and he must make that commitment for the sake of the deal. But no one believes that DeGeus will want his nemesis hanging around the office coffee bar any longer than necessary, and Madhavan will have followers as he goes out the door.

"I expect Rajeev to be gone within days of the deal being done," Jussel stated.

That is not to say there will be a mass exodus. According to Jussel, Magma has "some of the most intelligent and best educated people in the industry who love creating technology for IC design. They're working for the customers, not the logo." Those are the people Synopsys wants to keep, and it will be very generous to them.

DeGeus has stated that the talent of Magma was what was important to Synopsys, not the technology, so does that mean Magma's tools are going away?  Jussel laughed at that question. "We'll see how that works out.  The existing installed base loves the Magma tools so they will continue to support those product lines or lose the business altogether."

But there will be business minds with wads of cash in their pockets that won't be as welcome.  The doors of start-ups will be wide open for these people.  That is very good news for an industry that has limped along for much of the past two decades and whose lack of consistent innovation has held back the semiconductor industry as well.

 Was the Synopsys Magma deal good for the industry?  Tell us why at element14.com


Component Engineering Community Launches

This week a free online community for components engineering (CE) professionals launched at www.componentsengineering.com. The new site offers a forum for engineers to find information, as well as tools to create and maintain a CE department within their respective organizations. The site includes free, downloadable procedures, processes, flowcharts, and guidelines, along with tools and resources for learning the basic disciplines of components engineering. It also provides resources for both fundamental and advanced component-specific education.

Douglas Alexander, the founder and principal consultant, created the website to capture and extend the knowledge of experienced CEs. In addition to the current content, contributions of original white papers and other related documents are welcomed.

“After working in this field of electronics for over 30 years, and finding no website or book dedicated to this core discipline, I was determined to develop a site giving proper recognition to the community of engineers working behind the scenes at almost every manufacturing and engineering company known today.”

The title of components engineer has been around for many years, Alexander said, and there is a vast body of knowledge and capability resident in those who, for various reasons, have not worked in the field for some time but are not ready to retire. "Experience and knowledge should not be retired even if you are. Now, here is where you keep it alive."

Retired, semi-retired, and full-time CE professionals are welcome to submit their credentials, work-experience, and working locations on the site by email. Fees are confidential between the consultant and the clients.

“There is a catch,” Alexander explained. “The individual requesting a posting as a consultant must demonstrate a competency level by submitting white papers and/or other CE-specific documentation that will be reviewed by members of the site for acceptability.” These documents will be credited to their authors and will be reviewed by prospective clients to determine the consultant’s applicable knowledge and, ultimately, “worthiness” for hire.

Alexander said the site is a collaborative effort and will fulfill its full purpose as the community grows with the individual contributions from experienced practitioners. “It is my sincere desire to provide an opportunity for those who want to consult in this special field of engineering to contribute to these pages and form or reestablish peer-to-peer relationships with others of like mind and spirit.”

Smart grid adoption may rest on electronic design

By Lou Covey, New Tech Press Editorial Director

The establishment of the smart grid is an inevitability, according to experts speaking at the Smart Power Grid Technology Conference, but relying on power utilities, government and "field of dreams" marketing will only delay it. That's why the nascent industry needs the help of the electronic design community, according to speakers at the event, put on by ISQED.

Edward Cazalet, CEO of the Cazalet Group, and Tom Tamarkin, CEO of EnergyCite, painted a picture for more than 100 design engineers at the fledgling conference of an industry that is ready to spread nationwide save for public misunderstanding, governmental gridlock, and utility intransigence. Between the two presentations they offered a road map for the smart grid, but one that lacked a clear path to public acceptance.

“That’s why I’m here today,” Cazalet concluded. “We need your help to spread this word and identify how it can be done.” Both entrepreneurs were asking attendees to start exploring the potential of the smart grid for new product development, not unlike what came out of the PC industry in the 1980s.

Cazalet opened the conference with a description of the Transactive Energy Market Information Exchange (TeMIX). The exchange protocol makes it possible for energy providers and customers to buy and sell blocks of power at any time. That includes power utilities, power resellers and even customers with alternative energy systems that create more power than they need.  For example, an electric vehicle sitting in a garage after it reaches a full charge is essentially a block of power that can be utilized.  Offering that block on the exchange makes it possible for the car’s owner to sell that power to the grid.

“Any party can buy and sell power blocks to any other party,” Cazalet explained. “Customers purchase blocks of power by subscription, paying extra if they use more than what they purchased or selling back what they don’t use.”
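Cazalet's description implies a simple settlement rule: the subscribed block is billed at its contract price, and any deviation, over or under, clears at the going rate. Here is a minimal sketch of that arithmetic in Python; the function name, prices and quantities are illustrative assumptions of our own, not part of the TeMIX protocol itself:

    # Minimal sketch of TeMIX-style block settlement (illustrative only;
    # the names and prices below are assumptions, not the TeMIX spec).

    def settle_interval(subscribed_kwh: float,
                        subscribed_price: float,  # $/kWh fixed by the subscription
                        actual_kwh: float,
                        spot_price: float) -> float:
        """Total charge for one interval; a negative result is a credit."""
        # Deviation from the subscribed block: positive means the customer
        # used extra power (bought at spot), negative means surplus sold back.
        deviation = actual_kwh - subscribed_kwh
        return subscribed_kwh * subscribed_price + deviation * spot_price

    # A fully charged EV sitting in the garage is, in effect, a sellable block:
    # the owner subscribed to 10 kWh, used only 6.5, and sells the rest back.
    charge = settle_interval(subscribed_kwh=10.0, subscribed_price=0.12,
                             actual_kwh=6.5, spot_price=0.20)
    print(f"Interval charge: ${charge:.2f}")  # $0.50: $1.20 block minus $0.70 credit

Because every party, utility, reseller or EV owner, settles by the same rule, "any party can buy and sell power blocks to any other party" falls out of the arithmetic directly.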

At present, however, that infrastructure depends on connecting smart meters to the supplier's power blocks and to consuming devices. While utilities have been under federal and state government directives to deploy these devices since as early as 2004, widespread distribution is still creeping along.

Tamarkin pointed out that the California Public Utilities Commission (CPUC) mandated the installation of smart meters by investor-owned utilities in 2004. Southern California Edison (SCE) initiated formal opposition that same year. In 2005 Tamarkin drafted and personally presented documents to prove the benefit to SCE, causing SCE to formally reverse its position and move forward on the initiative.

Tamarkin explained that the current method of billing rate payers is to provide a bill for what was consumed 90 days earlier. Rate payers can only adjust their usage after the fact and hope they are doing some good. A completed smart grid, starting with smart meters, allows rate payers to see what their consumption is at any given point and what they will have to pay for it.

Tamarkin likened the potential to the relationship between a car and the gas pump. Once a consumer puts the nozzle into the tank and starts pumping, he knows exactly what is going into the tank and how much it costs. And the gas gauge in the car tells him exactly what his consumption rate is, with some cars telling the driver whether his mileage is optimal. With that knowledge and TeMIX in place, Cazalet said, consumers would be able to purchase sufficient power for their needs on a just-in-time basis and utilities would be better able to predict where that energy should come from and how much to produce.

The problem, however, is that the utilities have not shown much interest in completing the loop with the consumer, possibly because it doesn't benefit them in the short run. As Cazalet put it, "If it isn't about generation or distribution, they don't much care to talk about it."

That has allowed the discussion to be directed by unknowledgeable consumer groups that base their arguments against the technology on misrepresentations and isolated instances of bad installations. For example, Joshua Hartnett, a vocal opponent of smart meter installation on supposed radiation grounds, uses a BlackBerry phone that puts more radiation into Hartnett's head than he would ever receive from a neighborhood full of smart meters. That utilities and governments have moved to correct the misrepresentations only in the past year has contributed to the lack of adoption.

Both Cazalet and Tamarkin asserted that once consumers have easy access to products that could tie into the smart grid, it would create a groundswell of demand and pressure on legislators, regulators and utilities.