Technology

Poland a bright spot in EU fiscal woes

Recently, bad economic news has been almost a daily occurrence out of the European Union, but there are occasional bright spots that miss the regular news cycle.  Poland seems to be one of them. Under the terms of the Treaty of Accession 2003, Poland is obliged to replace its current currency, the zloty, with the euro; however, the country may adopt the euro no earlier than 2019.  That's probably good news for Polish start-ups, which seem to be able to find plenty of government support and venture capital for a raft of innovative technologies.

Footwasher Media's Lou Covey sat down with three Polish startup companies touring Silicon Valley recently, as they were on the hunt for partners and investors to help them expand into the US.  The three companies were Ekoenergetyka, with electric vehicle charging technology; virtual environment maker i3d; and a chemical synthesis innovator called Apeiron.

This interview is the first in a series of reports and interviews on the state of European innovation and efforts of the European Commission's Digital Agenda.


A Christmas gift for that paranoid engineer friend

Do you have friends fretting over the carcinogenic properties of mobile phones?  A little company in New York has a great Christmas gift for them.  Saelig Company, Inc. has introduced the WiPry-Combo dynamic power meter and spectrum analyzer accessory for the iPad, iPod Touch, and iPhone.  The aftermarket device provides the touch interface not available on PC-based instruments, turning any iOS device into an ultraportable spectrum analyzer and dynamic power meter.  WiPry-Combo shows RF waveforms like an oscilloscope.  Actual power output can be triggered, captured, and recorded for protocol verification or for troubleshooting wireless devices, with data logging in CSV format.

In the spectrum analyzer mode, WiPry-Combo can identify interference or open channels in the 2.4GHz ISM band. Operating in the frequency range of 2.400 to 2.495GHz, it measures signals from -40dBm to +20dBm with an amplitude resolution of 2.0dBm and a bandwidth resolution of 1MHz.  The band sweep time is 200ms.
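For readers less fluent in RF units, dBm is a logarithmic measure of absolute power referenced to 1 milliwatt, so the instrument's stated -40dBm to +20dBm range actually spans six orders of magnitude in power. A quick sketch of the standard conversion (not tied to any WiPry software, just the textbook formula):

```python
import math

def dbm_to_mw(dbm: float) -> float:
    """Convert absolute power in dBm to milliwatts: P(mW) = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    """Convert milliwatts to dBm: P(dBm) = 10 * log10(P(mW))."""
    return 10 * math.log10(mw)

# The WiPry-Combo's stated measurement range:
print(dbm_to_mw(-40))  # 0.0001 mW, i.e. 100 nanowatts
print(dbm_to_mw(20))   # 100.0 mW
```

The 2.0dBm amplitude resolution, in other words, corresponds to power steps of roughly 1.58x rather than a fixed number of milliwatts.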

The dynamic power meter mode graphically displays RF power levels across the 100MHz to 2.7GHz range, plotting waveform amplitude with respect to time.

The device is available from the company for $199.95. Software, including a demo package that does not require the actual WiPry device, is available free at the Apple App Store in English, French, German, Italian, and Spanish.

Sponsored by element14

Apple TV won't be AppleTV.

The growing relationship between Sharp and Apple that was revealed last week put to bed conjecture about whether Apple's next leap might be into television.  It is.  And that leaves an open question: Has Apple gone mad?

The profit margin on TVs is razor thin at best. In 2007 the average screen sold for $982; this year it's $545 and, in many cases, TVs are a loss leader for electronics retailers (you make money on the cables, you know).  Apple has always been about margin, and its phones, computers and tablets have carried much higher profit margins than just about anyone else's.

While people will buy a new computer or car or phone every couple of years, they tend to stick with one TV for a long time.  The Consumer Electronics Association has HDTV penetration at 87 percent, which means anyone who wants one probably already has one.  Apple will have to find a way to convince buyers that they really need a new TV, and a technology bell or whistle isn't going to cut it. Sales of 3D TVs are in the toilet and that was supposed to be the next big thing.

Steve Jobs dropped a hint to his biographer when he said he had finally figured out how to change the TV market.  Like all of Apple's breakthroughs, it had to be in the arena of the user interface, demonstrated with the release of the iPhone 4S: the voice interface Siri. The Apple TV final product may not be hardware at all, but voice recognition software. And after all the years that Apple has remained steadfastly against licensing its technology, Siri could become a standard in television and a steady stream of revenue for Apple.

In the past two years, TVs have become connected to the internet, cable systems, and telephones with multiple input ports. But that has made their use even more complex for the average user.  A huge after-market industry for universal remotes relies on this complexity for their sales.  In fact, the complexity of modern electronics is the final barrier to adoption for many.

But Siri could make controlling the various functions as simple as vocalizing a request. "Adjust sound for music." "Record CSI:Miami." "Show me email."  Combining the technology with Facetime would make it possible for the user to say, "Call Mom" and start a video chat on the main screen.  The vision of a communications hub in the home could be realized, not with new hardware and a bunch of cables, but with one app.

That's pretty big.

What do you think the next big evolution in TV will be?   Join the discussion at www.element14.com


Does education lack perspective more than funding?

By Lou Covey, Editorial Director, Footwasher Media

Is learning from the past the key to the future, as philosopher George Santayana believed?  A former Lockheed-Martin CEO thinks so.

In a recent Wall Street Journal article, Norm Augustine, an IEEE Fellow who served as Lockheed-Martin's CEO  from 1996 to 1997, said that the problem with US education is not a lack of focus on science and engineering, or even economics, but on history and communication skills.

Taking aim at STEM (science, technology, engineering and math) education is a well-worn road for industry executives and gets fairly big headlines.

Earlier this past year, Google CEO Eric Schmidt took a minute from a fairly long speech in the UK to slam the UK education system for not encouraging science and math students.  As a result, most every member of the media in the UK, and many in the US, ignored 99 percent of Schmidt's text and focused on those four paragraphs out of 200.  What was missed almost entirely in the coverage was the real focus of the speech: fostering innovation to boost the world economy. Even Augustine had piled on previously, in a January 2011 Forbes Magazine piece blaming the lack of spending on science and technology education, as well as a lack of spending on energy technology, for the seeming dearth of innovation in the US. He claimed that the West spends more on potato chips than on energy research. According to recent data, however, Augustine's later position might have more validity.

When discussions arise about the state of education, the focus is always on the current cuts at the individual level (local, state and federal), but the discussions rarely look at the whole. And that "whole" paints a very different picture.

According to UNESCO, total education spending worldwide now exceeds $7 trillion for 2011 alone. Total public spending on education in the United States is 24 percent of that total for 4 percent of all the world's students from elementary to graduate school: close to $2 trillion this year. Even with the cuts of the past decade, this total is greater than that of any other country in the world.  In fact, the US spends more than the next five countries combined.  The Institute for Energy Research estimates that the US also spends seven times more on alternative energy technology than on fossil fuel, and that 70 percent of what is spent on alternative energy is in the form of direct grants, while 90 percent of the spending on oil is in the form of loans that are repaid. (By the way, the US spends $6 billion a year on potato chips and $70 billion on alternative energy research.)

So it's not a lack of money spent on education or innovation.  Augustine points out that the National Assessment of Educational Progress shows that US high school students' scores on STEM subjects, while low by world standards, are not their lowest scores.  Surprisingly enough, their best subject is economics.  Their worst score is in history.

"A failing grade in history suggests that students are not only failing to comprehend our nation's story and that of our world, but also failing to develop skills that are crucial to employment across sectors," he wrote. "Having traveled in 109 countries in this global economy, I have developed a considerable appreciation for the importance of knowing a country's history and politics."

What seems clear is that the West is not getting what it pays for in education.  That is not a reason to reduce funding, but it is a good reason to reexamine the educational priorities.

What did the Google Chief say about the quality of UK school curriculum?  Find out at: www.element14.com


Fire may change the game... but not for Apple.

By Lou Covey, Editorial Director, Footwasher Media

The web is awash with reviews of the Kindle Fire, many positive (some scathingly negative), and the comparisons to the iPad are just as plentiful.  The question that keeps coming up, however, is whether the Fire is a game changer in the tablet war.  Probably not for Apple, but probably in the Android world, and definitely in the remains of RIM's empire.

In the iPad comparison, the Fire is the inexpensive, entry-level tablet for noobs.  At $199, it is less than half the price of the iPad, which means that for people who want the media experience of a tablet at a bargain price, it's a good choice. Apple, though, has released the latest version of the iPod Touch at the same price, so if the user doesn't care about screen size, a more flexible, more powerful product is still available from Apple. The Fire performs slower, and using keypad apps will be difficult on the much smaller screen, barring significant improvements in touch technology.

The iPad, especially when paired with an after-market bluetooth keyboard, makes an effective laptop replacement.  There are even productivity apps that make it possible to use the iPad for word processing, spreadsheets and presentations. All of that is lacking in the Fire.  As far as content goes, the Fire serves well as a distribution method for Amazon, but like most Android devices, it lacks the depth of apps in the iOS universe.  So Apple execs won't be losing any sleep over the sales of the Fire. Google, on the other hand...

The introduction of the Fire further fragments the developer community that is divided between iOS, Android, Blackberry and even Windows Phone 7 (WP7).  Developers can bypass the Google Market and deal directly with Amazon, which is great for Amazon but not so much for Google.  IDC just released a quarterly survey showing that developers are abandoning all other tablets in North America to create apps for the Fire.  The trend seems to be going that way in Asia and Europe, as well.  So while Google was looking at Apple as its main competitor, Amazon has been snaking the market out from under it.  Yoink!

The future for RIM's Blackberry is even grimmer.  The same IDC report said WP7 has now surpassed RIM as the third-place tablet OS that developers prefer to work in.  Along with the continuing decline in the overall device market, RIM seems to be hanging on by its fingernails. So the Fire IS a game changer for RIM.  Their technology has just not kept up with the market development.  The Playbook was a joke, a little less funny than HP's tablet.

RIM is going nowhere... except into someone else's division. RIM still has a lot of value.  It has a pretty loyal customer base, albeit a shrinking one; that bag of Nortel patents in wireless technology; the best security platform; and the best integration of MS Exchange and Lotus Notes.  Microsoft could become a serious competitor to Android and iOS if it bought RIM, and that would change the game for everyone.

Sponsored by element14.com

Can Solar survive Solyndra aftermath?

By Lou Covey, Editorial Director, Footwasher Media

The recent collapse of a few high-profile solar energy companies, like Solyndra and Beacon Power, has caused even the most ardent fans of alternative energy to ask, "Can this industry survive?"  The answer is a resounding yes and no.  It all depends on what government at all levels does.

Current public impressions of the health of any industry are colored by recent history.  The financial failings of companies and industries considered "too big to fail" are what most people think of when hearing news about solar.  But unlike the auto industry, with a population of three major players, the solar industry is filled with hundreds of start-ups struggling to establish themselves.  Even if one, two or two dozen go down, it is still well populated.

"Although panel manufacturing is in trouble, the solar industry is doing relatively okay," said Chirag Rathi, a senior consultant on the energy industry for Frost & Sullivan. "This is largely due to the advent of solar leasing companies in the U.S. One such company, SolarCity, was even given a contract to install solar power on up to 160,000 military homes. The program was supposed to be supported by the Department of Energy (DoE), which had extended a conditional commitment for a partial guarantee of a $344 million loan to support the project."

Government subsidy and purchase are the key to whether the industry thrives. The DoE recently announced a new initiative to fund solar collection technology development and the Department of Defense (DoD) is under congressional mandate to reduce fossil-fuel consumption by 50 percent.

The reality is that all forms of energy production are heavily subsidized by governments throughout the world.  China has invested hundreds of billions of dollars in its solar panel industry.  Spain's financial difficulties are directly tied to the 100 percent subsidy it gave the industry there, a subsidy it can no longer support.  Even Germany, relatively healthy in the world economy, is struggling to maintain its levels of support to the industry.  In the US, most of the government support (federal, state and local) is actually tied to the installation industry.

"The purpose of government subsidies for renewables is to reduce costs and make them economically viable alternatives to fossil-fueled electricity generation," said Jay Holman, research manager for solar energy strategies at IDC. "As the cost of electricity from renewables drops, it is natural that the subsidies drop as well: this is an indication of progress. The trick with subsidies is to encourage industry growth without placing too heavy a burden on electricity ratepayers or taxpayers. A flat, constant subsidy won't do the trick: it needs to drop in line with falling costs."

Holman said Germany and Italy automatically reduce subsidies based on the amount of solar installed in the previous year, which provides transparency and predictability for the market.

"In the US, however, we send the issue back to congress every few years and let them duke it out. That is an incredibly inefficient approach that makes the subsidy situation extremely difficult to predict."

Holman concluded that what the US industry needs is a long term subsidy plan that makes automatic subsidy adjustments based on the rate of installations and/or the cost of electricity from renewables.

Solyndra collapses.  Why are the generals smiling?

The fall of Solyndra was expected, and the DoD is happy

By Lou Covey, Editorial Director, Footwasher Media

The collapse of Solyndra has been the subject of both major news coverage and a foundational bit of political discourse recently.  A closer look at the facts reveals that the reality of Solyndra and the solar industry is far from the speculation, especially when viewed from a military perspective.

In the wider scope, industry analysts and observers wonder what all the kerfuffle is about, because everyone who knew the industry knew that Solyndra was not going to make it, especially in the current market.

"Solyndra's CIGS solar panels were expensive," according to Chirag Rathi of Frost & Sullivan. "The technology was innovative when it started out six years ago, but the global marketplace changed so fast in this time period that it became incredibly difficult for them to compete on price.  Their per-watt production cost was widely believed to be above the $6 mark, much higher than the $1.75 per watt (and falling) of poly-crystalline technology."

According to the industry rule of thumb, for alternative energy to be competitive with fossil fuels, the cost per watt needs to fall below $1.

Rathi pointed out that the solar panel industry is in oversupply with the massive capacity coming out of China and Taiwan. "The Chinese government has provided more than $30 billion in soft loans to the domestic panel manufacturers."

With all this common knowledge, the persistent question has been: Why did the Obama administration push forward with the loan program?  The first answer is, well, that's been the way things have been done for some time.

Contrary to conventional thought, alternative energy gets the lion's share, by far, of any government investment in energy, including fossil fuels.  According to the Institute for Energy Research, direct federal subsidy (that's cash, not tax incentives) for renewable energy topped $14 billion in 2010, while total subsidy of fossil fuel (gas, oil and coal) was just under $3.4 billion... and 90 percent of the latter was in tax incentives, not actual cash payments.  And since the Solyndra investment was only in the form of loan guarantees, it won't come out of the federal budget until the bankruptcy is complete.  In other words, the fall of Solyndra has not yet cost the government anything.

So what, specifically, did the government get out of the Solyndra deal?  That's where no one is looking, and where you need to look to find the more interesting story.

Find out why the generals are smiling at Element14.com


Making ID verification affordable

Running a business, large or small, has become fairly complicated in recent times, especially because in so many cases you have to verify that the person you are dealing with is who they say they are. Two of the most difficult, annoying and controversial tasks are dealing with identity theft and making sure new hires are documented workers.  Despite the popular belief that hiring undocumented workers is rampant, most companies do comply with the law and check submitted documents according to established procedure.

However, that doesn't stop lawsuits from parties fired for submitting false documents, as former California gubernatorial candidate Meg Whitman discovered.  It also doesn't stop people whose identity may have been stolen by a potential employee from suing companies that hire the frauds.  While no court has held a company or employer liable for unknowingly accepting false documents, defending against such suits still creates an undue financial burden.

And in the developing political climate, it appears the burden of proof will soon shift from the potential employee to the employer.  In the United States, an employer can now face significant jail time plus a fine of up to $250,000 per count for hiring workers without proper documentation.  That is a powerful incentive to thoroughly check identities.

All that makes verifying identity big business and, so far, most of the investment has been going toward government agencies.  For example, the US Transportation Security Administration (TSA) this month is spending millions of dollars to test technology allowing agents to spot falsified documentation.  In India, the government is working on identity-theft-resistant ID cards that allow the poor to receive aid.

On the corporate level, there are multiple companies providing that kind of service, which can end up costing several thousand dollars a year, making it practical only for large corporations.

On the private-citizen/local-business level, there are several products and services available to verify customer identities, check the authenticity of document formats and run background checks on individuals, but most have separate functions.  Some check the validity of passports, some credit cards, some driver's licenses.  Some do only credit checks.  So again, if you purchase all the equipment, it can end up costing several thousand dollars. The problem with a technology-only focus on verification is that it is not always accurate.  False positives and false negatives occur more often than is acceptable, which means the human element cannot be totally eliminated.

A very small number of companies combine expertise and technology to handle multiple applications.  One such company is IDChecker in the Netherlands.  The company offers a cloud-based, hands-on approach to document verification that covers passports, driver's licenses and national identification cards.  It's also looking into the potential of providing personal identification codes that can fast-track users through verification processes for e-commerce and travel. IDChecker also employs identification experts, many hired out of government service, to "eyeball" scanned documents.  The service comes at a premium, but when absolute accuracy is required, nothing can beat the human eye.  At least not yet.

imPARTing Knowledge 3: Understanding Component Engineering - Microcontrollers

By Steve Terry, SK Communications, Advisor to ComponentsEngineering.com

Many small board designs benefit nicely from the use of a microcontroller.  But selecting an appropriate one for a particular design often brings on the feeling of "Where do I begin?"

This discussion limits its focus to low-end microcontrollers.  For this purpose, we'll stick with 8-bit devices.  "8-bit" simply means that internal processing operates on only 8 bits at a time.  As one would expect, 16- and 32-bit micros operate much faster, as they process more bits of data with each instruction.  To be sure, much of the same thinking applied to 8-bit microcontrollers can be applied to the 16- and 32-bit devices;  however, cost, size, capabilities, performance, feature integration, and a host of other upscaled attributes quickly make it increasingly difficult to generalize on approach and applicability.

That said, even in the 8-bit microcontroller world, there are many highly specialized devices.  So, to avoid confusion, we'll leave that subject for a future discussion and stay with the garden variety parts for now.  Quite often, if your design truly calls for one of these specialized micros, there's not going to be much choice, and you'll likely be familiar with those choices already, so you should be okay.

What is a microcontroller, anyway?

The key trait that distinguishes a microcontroller from a microprocessor is that it's a microprocessor with a smorgasbord of built-in peripherals.  For relatively simple board designs, such as controller boards, those embedded peripherals can save a lot on design effort and BOM (Bill of Materials) cost.  Microcontrollers are commonly referred to as MCUs (for "microcontroller unit");  it's nice and short and kinda rolls off your tongue, so we'll use it here, too.

Base MCU feature sets typically include three types of memory (flash, RAM, and EEPROM), general purpose I/O (GPIO), support for various communications ports (UART, I2C, CAN, etc.), timers/PWMs, ADCs, DACs, internal oscillators, temperature sensors, and low power management.  From there, the feature sets branch out widely.  And this is really where the details come in to play for component selection.

Establishing requirements

With so many vendors and varieties of low-end micros, you may find it surprising that a good percentage of them will likely satisfy your design requirements.  But even though so many will usually do the job, tailoring the selection tightly to your particular needs and preferences can make for a much smoother ride in the long run.

Generally, the first step is to define what functionality you must have.  For example:  How many GPIO pins (always trying to include a few spares for those late design changes)?  How many ADC or DAC channels, and with what resolution? Do you need timers or PWM control?  How many?  8- or 16-bit?

How do you need to communicate with other devices on this board or another board, such as I2C or SPI?  Keep in mind that it's always useful to bring a UART off the board for an RS232 debug port that you can connect to a terminal emulator on your PC.  And any components added to the board to support it can generally be left off in volume production.

How much code space do you think you'll need?  And how much RAM?  (Here, we don't consider that you'll need so much extra of either one that you'll need to add external memory devices.)  If you're not really sure on memory requirements, err on the high side since:

1. running out of memory can seriously impose on the ability to implement those last few features the marketing guys said they really want included, and

2. you can generally downsize the part later if it turns out you have more memory than you need – maybe do this as part of a cost reduction board spin.  Or, quite often (and if you plan it carefully), it will be a pin-compatible part, so it's simply a BOM change.

And, well, there's one more good reason that consistently proves prophetic:

3. Murphy's Law Corollary:  Code size grows to the size of available memory + 1 byte.
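The checklist above lends itself to a simple parametric screen. The sketch below is illustrative only: the part names and specs are invented, and a real shortlist would come from vendor parametric-search data, but it shows the requirements-first approach in code:

```python
# Requirements-driven MCU screening (hypothetical parts for illustration).
from dataclasses import dataclass

@dataclass
class Mcu:
    part: str
    gpio: int          # general-purpose I/O pins
    adc_channels: int
    flash_kb: int      # code space
    ram_kb: float
    uarts: int         # at least one for the RS232 debug port

CANDIDATES = [
    Mcu("VendorA-8x32", gpio=18, adc_channels=4, flash_kb=16, ram_kb=1,   uarts=1),
    Mcu("VendorB-8x64", gpio=24, adc_channels=8, flash_kb=32, ram_kb=2,   uarts=2),
    Mcu("VendorC-8x16", gpio=12, adc_channels=2, flash_kb=8,  ram_kb=0.5, uarts=1),
]

def shortlist(parts, gpio, adc, flash_kb, ram_kb, uarts=1):
    """Keep parts that meet every minimum requirement."""
    return [p.part for p in parts
            if p.gpio >= gpio and p.adc_channels >= adc
            and p.flash_kb >= flash_kb and p.ram_kb >= ram_kb
            and p.uarts >= uarts]

# 16 GPIO (including spares), 4 ADC channels, 16KB flash, 1KB RAM, one UART:
print(shortlist(CANDIDATES, gpio=16, adc=4, flash_kb=16, ram_kb=1))
# ['VendorA-8x32', 'VendorB-8x64']
```

Note that the memory minimums are where Murphy's Law bites: per rule 2, the larger VendorB part could later be downsized to VendorA if it proves pin-compatible and the code fits.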

Feel like you're ready to pick one?  Read the rest at: element14.


OctoBox eases testing of MIMO devices

Dropped calls on cell phones due to faulty antenna placement have been selectively publicized, as in the case of the Apple iPhone 4, but have been a common occurrence in all phones released in the past two years.  Mobile carriers are putting heavy pressure on manufacturers to mitigate, if not eliminate, the problem as soon as possible.  No, actually they want it done now. That puts the problem squarely in the laps of the test and measurement industry, which is meeting the demand with some alacrity as demand for the products increases and new technologies boosting speeds and transmission rates come online.

Of keen interest to product developers are compact solutions that test engineers can keep in their offices or at least within spitting distance.  Companies like Agilent, Aeroflex and Anritsu are providing several desktop solutions.  A small company in Boston, OctoScope, has pulled the wraps off a refrigerator-sized anechoic chamber, the OctoBox, that can test mobile devices without having to solder coax directly to the device antennae and deliver more real-world results.

"Lab testing with the devices’ actual antennas, even when the radios are not MIMO, is better than soldering coax to the antenna connections," said Charles Gervasi, an engineer with Four Lakes Technology in Madison, Wisconsin.  "For functional test in production, an over-the-air test is the only option.  Automated test equipment can be configured to test multiple devices at once in the chamber." (Read Gervasi's full review of the OctoBox at element14.) http://www.youtube.com/watch?v=J60uC-7xKsY

Can we survive the loss of Steve Jobs?

By Lou Covey, Editorial Director, Footwasher Media

In the outpouring of grief over the death of Apple founder Steve Jobs has been an underlying meme of concern regarding not just the future of Apple, but the potential for disaster in the semiconductor industry.  On one side are those whose fortunes ride on the continued success of Apple; on the other are those who would prefer that Apple's current leadership in consumer electronics and applications be blunted in favor of their own.  That makes it difficult to have an objective opinion one way or the other.

One of the issues to consider is that Apple is now the largest buyer of chips in the world.  If Apple falters significantly in the near term, there is concern that the current growth of the chip industry could falter as well.  But is that true?

In June 2011, Apple surpassed HP as the largest buyer of chips, without a significant reduction in the amount purchased by HP.  What wasn't widely noted was that in July, Amazon also surpassed HP, driving the latter to third.  The phenomenon of Apple's iPhone and iPad has launched a massive buying season among other companies working to take a bite of Apple.  Should Apple's sales falter in the near term, the market demand for competing products will probably take up the slack.

Another meme is disappointment over the announcement of the iPhone 4s, disappointment rooted in nothing but the speculation of uninformed bloggers, and over the continued "delay" of the iPhone 5.  Apparently, the buying public wasn't as disappointed: pre-orders of the 4s have topped those of the 4, announced last year.  However, pundits seem to be missing critical pieces of information that could explain why Apple made an incremental rather than a radical advancement.

First is the issue of Samsung.

Samsung, the largest tech company in the world by sales, is competing directly against Apple in both the tablet and mobile phone markets and is probably the leading competitor, depending on who you talk to... but a distant one.  And Samsung's profit forecasts are tied directly to that competition in two ways: as a competitor and as a partner.  Samsung also manufactures the A4 chip for both of Apple's product lines.  Samsung downgraded its projected profit forecast in phones and tablets at the beginning of summer, anticipating sales chilling from the iPhone 5.  When it was revealed that the 5 was yet to come to market, Samsung's profit forecasts and actual profits rose.

Second, there's the investment Apple has made in the current design.  A source close to Apple said the company invested $1 billion in manufacturing for the iPhone 4 and 4s. Walking away from a manufacturing investment and then announcing a new product that would hurt an important supplier doesn't make a lot of sense, especially when the current product, with minor tweaks, is blowing the doors off everywhere with the help of three distribution channels (AT&T, Verizon and Sprint).

So the "failure" of Apple to deliver the next generation of its killer product line does not portend the ultimate failure and beginning of the end of its dominance.  It's merely a smart business decision.

Finally, the biggest question of all has been: "Can Apple actually survive, much less thrive, without Steve Jobs calling the shots?"  The reality is that Jobs had not been calling the shots on his own for quite a while.  A team of people, hand-picked by Jobs prior to his first medical leave, has provided the overall leadership.  During that process, Tim Cook emerged as the successor, just weeks before Jobs succumbed to his illness.  Many are blaming Cook for the less-than-stellar reception of the product announcement, but truth be told, there were moments as early as 2005 when even Jobs' decisions were questioned and identified as the beginning of the end for Apple's success.

A closer analogy to Apple's situation is when Bill Gates stepped down and installed Steve Ballmer as the new head of Microsoft.  Many questioned that move as well, but were comforted that Gates continued as chairman of the board.  With Gates keeping his finger in the pie while Ballmer led, Microsoft has lost half its value.  The difference between the two situations is that Apple now has a clean break from Jobs' leadership, allowing Cook, et al., to create a new future for the company.

There is not enough data to determine whether any one event, even one as earth-shattering as the death of a charismatic and visionary leader, will mark the finale of a remarkable business run, but this is what we do know:  Apple has products completed and ready to launch for the next five years; it has $76 billion in cash reserves; and it has the largest valuation of any US company.  With that kind of foundation, any speculation that focuses on one event or issue is as sure as a throw of the dice in a back-alley craps game.

Can we learn from Jobs' life?  And can we do something positive now?

imPARTing Knowledge 2: Resistors

By Douglas Alexander and Brian Steeves, CE Consultants, http://www.componentsengineering.com

My first up-close-and-personal experience with a resistor came courtesy of a 9V transistor radio in the middle of a Dodgers baseball game in 1959.

I was listening to the game in the fourth grade with the earphone wire snaked from my jeans' front pocket, through my belt, (first historical use of a strain relief), under my sweater, up through the button hole in my collar, (second historical use of a strain relief), to insert the plug end, inconspicuously, as close as possible to my inner ear, cleverly designed to avoid the teacher’s detection.

All of a sudden, I began to smell something burning. Then I lost the audio entirely. At recess, I opened the back of my transistor radio to discover a cylindrical device with colored stripes printed around the cylinder with one of the two red stripes partially obliterated by a brown-black burn site that extended to the board and wiped out the “R” before the 10 label. I missed the end of the game but I did catch the end of my radio. Thus, my first experience with troubleshooting had begun with my nose and had ended with my eyes. Later in life, I heard my electronics professor telling me that I had discovered the first two steps for troubleshooting any circuit.

From that point on, I have developed a long-term professional relationship with the resistor, the most commonly used discrete device in electronics.  The following is a discussion on some of the various resistor-based applications commonly in use today. This is not an exhaustive list, and the reader is invited to suggest other applications.

When placed in series with each other, with a tap connection between them, resistors are used as voltage dividers to produce a particular voltage from an input that is fixed or variable. This is one way to derive a bias voltage. The output voltage is proportional to that of the input and is usually smaller. Voltage dividers are useful for components that need to operate at a lesser voltage than that supplied by the input.
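The divider math is simple enough to sketch in a few lines of Python; the resistor values and input voltage below are illustrative, not taken from any particular design:

```python
# Voltage divider: two resistors in series, output taken at the tap
# between them.  All component values here are illustrative.

def divider_out(v_in, r1, r2):
    """Output voltage at the tap between R1 (top) and R2 (bottom)."""
    return v_in * r2 / (r1 + r2)

# Deriving a roughly 3.3 V bias from a 5 V rail with R1 = 5.1 kOhm
# and R2 = 10 kOhm:
v_out = divider_out(5.0, 5100, 10000)
print(round(v_out, 2))  # prints 3.31
```

As the formula shows, the output is always proportional to (and, for positive resistances, smaller than) the input.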

Resistors also help to filter signals used in oscillator circuits for video, audio, and many other clocked circuit devices. Used together with capacitors, this is known as an “RC” circuit, and the oscillation is a function of the two interacting to produce a time constant.
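For the curious, the RC time constant and the corresponding -3 dB cutoff frequency work out like this (component values are illustrative):

```python
import math

# RC circuit: the time constant tau = R * C governs how quickly the
# capacitor charges; the -3 dB cutoff of a simple RC filter follows
# from the same two values.

def rc_tau(r_ohms, c_farads):
    """Time constant in seconds."""
    return r_ohms * c_farads

def rc_cutoff_hz(r_ohms, c_farads):
    """-3 dB cutoff frequency of a first-order RC filter, in Hz."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

# A 10 kOhm resistor with a 100 nF capacitor (illustrative values):
tau = rc_tau(10_000, 100e-9)        # 1 millisecond
f_c = rc_cutoff_hz(10_000, 100e-9)  # roughly 159 Hz
```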

Because the flow of current through a resistor converts electrical energy into heat energy, the heat generated from the high resistance to the flow of the current is used commercially in the form of heating elements for irons, toasters, room heaters, electric stoves and ovens, dryers, coffee makers, and many other common household and industrial products. Similarly, it is the property of resistance that causes a filament to “glow” in light bulbs.

Current shunt resistors are low resistance precision resistors used to measure AC or DC electrical currents by the voltage drop those currents create across the resistance. Sometimes called an ammeter shunt, it is a type of current sensor.
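Ohm's law does all the work in a shunt measurement; here is a quick sketch with illustrative values:

```python
# Current shunt: infer the current from the voltage drop across a
# known, low-value precision resistance, via Ohm's law (I = V / R).
# The 75 mV / 1 milliohm figures below are illustrative.

def shunt_current(v_drop, r_shunt):
    """Current through the shunt, in amps."""
    return v_drop / r_shunt

# A 75 mV drop measured across a 1 milliohm shunt implies roughly 75 A:
i = shunt_current(0.075, 0.001)
```

The shunt resistance is kept deliberately tiny so that the measurement itself wastes little power and barely disturbs the circuit.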

Resistive power dividers or splitters have inherent characteristics that make them an excellent choice for certain applications, but unsuitable for others. In a lumped element divider, there is a 3dB loss in a simple two-way split. The loss numbers go up as the split count increases. Splitters can be used for both power and signal.

Resistors can be used in stepped configurations when they are tapped between multiple values or elements in a series. In the absence of a variable resistor (potentiometer), connecting to the various taps will allow for different fixed resistance values.
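The resistance seen from one end of the string to each tap is simply the running sum of the series values; a quick illustration (the values are hypothetical):

```python
from itertools import accumulate

# Stepped (tapped) resistor string: the resistance from one end to
# each tap is the cumulative sum of the series elements up to it.

def tap_resistances(r_values):
    """Resistance from the string's first terminal to each tap, in ohms."""
    return list(accumulate(r_values))

# Three series resistors of 100, 220 and 470 ohms give taps at:
print(tap_resistances([100, 220, 470]))  # prints [100, 320, 790]
```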

Using high-wattage wire-wound resistors as loads for 4-corner testing is a common practice in the qualification process of a power supply. By varying the line voltage and the resistive load to all four extremes, low line, high line, low load, and high load, a power supply’s operating limits can be determined.
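The four corners are just every combination of the two line extremes and the two load extremes; a sketch with illustrative limits (the voltage and current figures are not from any particular supply specification):

```python
from itertools import product

# 4-corner power-supply testing: exercise every combination of
# line-voltage extreme and load extreme.  Limits below are illustrative.
line_volts = (90.0, 264.0)  # low line, high line
load_amps = (0.1, 5.0)      # low load, high load

corners = list(product(line_volts, load_amps))
print(corners)
# prints [(90.0, 0.1), (90.0, 5.0), (264.0, 0.1), (264.0, 5.0)]
```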

Wire-wound resistors, a precision version of which was invented and patented by the African-American engineer Otis Boykin, are made by winding a length of wire on an insulating core. They can dissipate large power levels compared to other types and can be made with extremely tight resistance tolerances and controlled temperature characteristics: they are designed to minimize resistance value change, or to change in a controlled manner, over different temperatures. Those qualities also make them ideal for compensating strain-gauge transducers, where they offer the necessary accuracy and perform reliably at high temperatures.

Typically, a single one-megohm resistor is used with an antistatic or ESD wrist strap for safely grounding a person working on very sensitive electronic components or equipment. The wrist strap is connected to ground through a coiled, retractable cable with the 1 megohm resistor in series with ground. This allows any high-voltage charges or transients to leak through to ground, preventing a voltage buildup on the technician’s body and thereby avoiding a component-damaging shock hazard when working with low-voltage-tolerance parts. This transient resistor is commonly referred to as a “mobile ohm”. These are usually designed in with a VoltsWagon pulling them.

Can I get a groan from someone? Author’s note: This is not an original pun, but I would rather diode than young. Author’s note: That was original. Sorry, I just couldn’t resist—or could I?
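Returning to the grounding strap for a moment, a back-of-the-envelope check shows why one megohm is the customary value: it bleeds static charge away while limiting any fault current to a harmless level. (The line voltage below is illustrative.)

```python
# ESD wrist strap sanity check: the series 1 megohm resistor limits
# the current that could flow through the wearer if the strap ever
# contacted a live voltage.  Line voltage below is illustrative.

def fault_current_ma(volts, r_ohms=1_000_000):
    """Current through the strap's series resistor, in milliamps."""
    return volts / r_ohms * 1000

# Accidental contact with a 230 V line through the strap:
i_ma = fault_current_ma(230)  # roughly 0.23 mA, a harmless level
```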

Passive terminators consist of a simple resistor. There are two types: (1) a single resistor between signal and ground, as in Ethernet, or (2) a resistor pair, one from the positive rail to the signal line and another from the signal line to the negative rail. Terminating resistors are widely used in paired transmission lines to prevent signal reflection.

See the entire article at Element14.com


FPGA verification must address user uncertainty for prototyping, system validation

By Loring Wirbel, Senior Correspondent, Footwasher Media

The recent expansion and diversification of the FPGA verification market bears a certain resemblance to the ASIC verification market of 20 years ago, though beset with opposite challenges, thanks to the changes wrought in those 20 years by Moore’s Law. When companies such as Quickturn Systems created large logic emulation systems to verify ASICs in the early 1990s, users had to be convinced to spend significant amounts of money while dedicating floor space equivalent to a mainframe, all to verify system ASICs. Today, FPGA verification can be addressed in add-in boards for a workstation, or even in embedded test points within the FPGA itself.

But even as customers in 1990 were reluctant to move to logic emulation because of its price tag, today’s FPGA verification customer may show some trepidation because such systems can seem simplistic, invisible, or of questionable value. In many cases, however, FPGA users dare not commit to multiple-FPGA systems (or to ASICs prototyped with FPGAs) without these tools. Newer generations of FPGAs, incorporating the equivalent of millions of gates, integrate RISC CPUs, DSP blocks, lookaside co-processors, and high-speed on-chip interconnect. Verification of such designs is a necessity, not a luxury.

Click here to read the full text of the FPGA verification must address user uncertainty for prototyping, system validation article.

S2C bridges HW prototyping and SW development

As FPGAs have become larger, their use as a prototyping tool has become more diverse, including designs that put multiple processors in a single device and system. The business of FPGA prototyping has grown with that ability. What began as a means of prototyping other silicon devices has become a way to validate the FPGA itself, an indication of how the FPGA verification market can be used to bootstrap a next-generation FPGA based on known designs. S2C is one of the companies building a profitable business in this niche, as this New Tech Press report demonstrates.


Tektronix moves to integrate single-chip functionality through instrumentation

Early in the summer of 2011, Tektronix acquired a small company, Veridae, to move its instrumentation business directly into the world of FPGA prototyping and verification.  This New Tech Press interview with Brad Quinton, founder of Veridae and now in charge of Tektronix' embedded instrumentation group, shows how the company sees the potential of the market and is driving its strategy toward reality.

Latest Embedded Technology Trends

Over recent years, embedded systems have gained an enormous amount of processing power and functionality. Embedded computing is seeing a definite migration to 32-bit and 64-bit architectures, and from single-core to multicore processors. Embedded systems meet their performance goals, including real-time constraints, through a combination of special-purpose hardware and software components tailored to the system requirements.

View the full article at http://www.element14.com/

Hear what other engineers are talking about in Embedded Technology Trends at: http://www.element14.com/

Fundamentals of volatile memory technologies

By Brian Dipert, Principal, Sierra Media

RAM, i.e., "random access memory," is a commonly interchanged term for "volatile memory," i.e., memory that loses its stored data when power is removed. Yet RAM is not an acronym of which I'm particularly fond. Several decades ago, when the term was first coined, it admittedly was reasonably accurate, in contrast to ROM (read-only memory). ROMs, specifically mask ROMs (programmed at the fab) and PROMs and EPROMs (programmed on dedicated lab bench hardware, or on an assembly line), were random-location readable but not random-location writeable, at least in-system. RAMs, on the other hand, could be randomly read and written, thereby leading to the more general "random access" qualifier.

To read the entire article, “Fundamentals of volatile memory technologies,” see http://www.electronicproducts.com

Inverters: There's more to solar power technology than just panels

You have your solar panels and you know you are supposed to connect them to your electrical system, but what do we know about inverters? What's best for your solar power installation? Micro-inverter? Mini-inverter? String? Ground mount? Didn't know there were differences? Check out what IdaRose Sylvester, senior correspondent for New Tech Press, learned from Direct Grid Technologies and you will. http://www.directgrid.com. Sponsored by element14

Jigawatts and miniJoules! (What the heck's a miniJoule?!)

We've all heard that solar panels are expensive, difficult to install, and need large installation sites to be cost effective. Join us as IdaRose Sylvester, senior correspondent for New Tech Press, talks with Andre Steinau, director of miniJOULE, about a unique way to overcome some of the challenges associated with solar power. What the heck's a miniJOULE, you might ask? A miniJOULE is a small solar power plant you can put together yourself, generating some electricity to offset your daily use.