Lead Article

Bad data and overwhelmed professionals impede successful employment

The conclusion of a year-long New Tech Press investigation of the employment industry is that it is failing job seekers, employers and the investment community. Some of these services can point to a few areas of success, and some are better than others, but none is particularly effective. In fact, if a job seeker were to place one roulette bet or buy one lottery ticket for every job they apply for on a job listing service, they would be more likely to win a living wage from gambling than from job searches.

Employers and the employment industry rely on Standard Occupational Classification (SOC) codes and the North American Industry Classification System (NAICS) to form the foundation of job descriptions. In our interviews with HR professionals and system developers, we discovered that when they create job descriptions they consistently overuse and misuse the coding systems to pull in the most candidates, even unqualified ones.

Second, the "artificial intelligence" built into the systems is little more than a scan of those codes plus word-search functions, resulting in bad returns.

In making this discovery, we entered the following job functions into a dozen job-finding sites (e.g., Monster.com) and employment sites (e.g., Facebook):

  • Journalist
  • Editor
  • Marketing communications
  • Marketing manager
  • Audio design
  • Video editing
  • Program management

The searches produced, at most, two positions that were actually for those jobs, and hundreds of positions ranging from electronic systems design to sous chef. The most egregious example was a standard search for audio designer jobs. This job is crucial in the television, movie and video game industries and is a key focus of the broadcast and electronic communications major at San Francisco State University. Using that job title in a dozen job-finding sites (e.g., Monster.com) returned more than 5,000 positions for electrical engineers and computer scientists and not a single job for an actual audio designer.

To see why we received these results we dug into to SOC and NAICS codes appended to the job offering and found dozens in codes that had absolutely nothing to with the listed jobs. The reason for these errors was simple: When a company is looking for employees they input the codes related specifically to their industry or discipline. When a tech company, like Google, is looking for an engineer to design an audio codec.  They append the NAICS 541400 code for specialized audio systems design to the posting. The systems and HR professionals do word searches for “audio” and “design” and make the posting. So an audio design engineer who produces sound, get thousands of job listings for semiconductor designers who have experience in audio codecs.

Neither the NAICS nor the SOC systems are adequate sources of data for effective job placement efforts, yet these codes are foundational to all job postings.

The next problem is human fallibility. 

In spite of the flood of automated job sites and technology, all of which we have found to be horribly flawed, the HR industry depends on humans who are deluged by unqualified applicants fed to them by that flawed technology. Those professionals are well versed in the legal requirements of their profession but lack basic tech understanding and underuse relevant technology.

We talked to 27 in-house and independent recruiters over the past year, some of them senior HR managers and vice presidents. None of them knew the capabilities of their search technology, much less the full capabilities of artificial intelligence. Many were flummoxed by simple spreadsheet tools. As a result, they do not use the basic automation tools available, even free ones, and are generally overwhelmed by the amount of communication they handle daily from employers and potential employees. One manager at a major marketing automation company did not use her own company's technology for job openings. In another case, a recruiter for a social media company did not even have a profile on the company's platform.

The combination of bad and misused data and a lack of basic tech understanding results in an ineffective mechanism for matching qualified people with jobs and, hence, a high number of qualified people leaving the job market altogether.

As frustrating as this may be for employers and job seekers, it must be even more frustrating for the investment community. More than $4 billion has been invested in HR tech startups in the past two years, and there is no end in sight. Companies and job seekers are paying millions of dollars in subscriptions to these services with little positive result. Combining the current unemployment rate with the number of people who have left the job market, the effective unemployment rate in the US is 40 percent, with no relief in sight.

What can be done? Start with the data.

The SOC and NAICS codes were not intended to help people find jobs. They were designed to classify and maintain records of employment for population studies and taxation. Using them for job openings is a gross misuse of the data. The HR industry needs to employ experienced communicators and data scientists to develop and sort job-opening data. The communicators can create accurate and realistic offerings, and the data scientists, using sentiment analysis, can identify and process recruits more effectively.

Second, marketing automation and proper SEO can attract, sort and communicate effectively with applicants, making HR professionals more efficient and productive by eliminating frustration and wasted effort.

Finally, employers need an attitude adjustment regarding what they are looking for in potential employees. As the unemployment rate falls, it will be more difficult for overly selective companies to find productive employees. Instead of investing in tech based on bad data, they should invest in training HR staff to do better work.

Energy storage is big business, but is it safe?

By Lou Covey, Editorial Director

Judging from the number of companies that exhibited at the Intersolar/EES joint conference in San Francisco, energy storage is the next big thing in green energy. It resolves the problem of intermittent over- and under-production that plagues sources like solar and wind. It comes, however, at a high cost to the pocketbook, the environment and personal safety. The good news is that there are alternatives to conventional storage technology. The bad news is that those technologies are only now coming to market and face an uphill battle with the technology status quo.

SAN FRANCISCO, CA - Intersolar North America Conference at Moscone Center West, Tuesday July 8, 2014.

First, let's look at the most popular technologies dealing with storage and why storage is even necessary in the first place.

Wind and solar are the most popular sources of green energy today, but they are also two of the most inconsistent, inefficient and costly sources of electricity. Wind only produces power when the wind is blowing, and only within a narrow range of wind speeds; if the wind blows too fast or too slow, your wind turbine becomes essentially useless. Solar produces power most effectively between 11 a.m. and 4 p.m., which happen to be off-peak hours when less electricity is needed, and production from solar panels disappears during peak demand times. Adding a storage technology allows systems to save up electricity during peak production and provide "clean" power during peak demand. Because most systems are designed to store between 48 hours' and a week's worth of electricity without recharge, they also serve as a valuable source of energy when the wind doesn't blow or the sun doesn't shine.

The two most common forms of storage technology are lead-acid batteries, similar to but larger than the batteries in internal-combustion cars, and lithium-ion (Li-ion) batteries, like those used in electric and hybrid cars and mobile devices. The average lifespan of both technologies is about 10 years. That is where cost becomes a significant factor.

Since windmills and solar panels have a stated maximum-generation lifespan of 20 years, the batteries must be replaced or enhanced at least once during that lifetime (just as in any other device), long before the generators die. Both technologies also require significant and sophisticated hardware and software to charge, maintain and manage the flow of power efficiently and safely. Combine all of that and you get a cost of $1,000-$2,000 per kilowatt-hour (kWh). The average system needs to hold a minimum of two days' worth of power, so a typical 5 kWh storage product must be doubled up, and that 10 kWh of storage will cost between $10,000 and $20,000. That is added on top of the $30,000 the energy system itself costs, and, as stated, the storage will have to be completely replaced at least once during the life of the energy system. The minimum cost of energy storage, then, is more than half the cost of the energy system. Even with subsidies, that puts green energy out of reach for many residential and commercial users. And storage systems have other significant costs that should not be overlooked.
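A back-of-the-envelope check, using only the figures quoted above, shows how the storage bill reaches "more than half" of the system cost once the mid-life battery replacement is counted (a sketch, not any vendor's pricing):

```python
# Back-of-the-envelope check of the storage costs quoted above.
cost_per_kwh = (1_000, 2_000)    # $/kWh installed, low and high
storage_kwh = 10                 # two days' worth, per the article
system_cost = 30_000             # the solar energy system itself

purchase = [c * storage_kwh for c in cost_per_kwh]   # $10,000-$20,000
# Batteries last ~10 years against a 20-year generator lifespan,
# so the storage is bought at least twice over the system's life.
lifetime = [2 * p for p in purchase]                 # $20,000-$40,000

print(f"Per purchase: ${purchase[0]:,}-${purchase[1]:,}")
print(f"Lifetime storage vs. system cost: "
      f"{lifetime[0]/system_cost:.0%}-{lifetime[1]/system_cost:.0%}")
# Lifetime storage runs 67%-133% of the $30,000 system cost --
# "more than half" even in the best case.
```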

Now let's consider the environmental costs.

Stationary lead-acid battery systems (SLABS) come with a significant list of environmental compliance, enforcement and liability concerns, because the active ingredients are sulfuric acid and lead, two of the most toxic substances on earth. A spill may result from improper handling of hazardous-material discharge, or a slow, undetected corrosive breach in the battery housing can cause injurious, if not lethal, exposure to employees, workers or tenants. Overcharging a battery, due to a failure in the software control system, can release hydrogen sulfide, a colorless, poisonous, flammable gas that smells like rotten eggs.

Lead is a highly toxic metal, and once a battery becomes inoperative it must be properly collected and recycled. A single lead-acid battery disposed of incorrectly into a solid-waste collection system, and not removed before entering a mixed-waste resource-recovery facility, can contaminate 25 tonnes of waste and prevent the recovery of the organic resources because of high lead levels.

Sixty-four percent of all the lead produced through mining goes into lead-acid batteries, and the harmful effects of improper recycling are giving users pause about the technology for energy storage, even in third-world countries where regulations are less stringent. The government of India is mandating the elimination of the technology for anything but automotive use, because lead from storage batteries placed in unlined landfills is contaminating groundwater.

Lithium Ion

Li-ion batteries have advantages over lead-acid: they are smaller and lighter, making them preferable for electric cars and mobile devices. But they are in their relative infancy in large-scale use.

“We are at the very beginning in energy storage in general,” says Phil Hermann, chief energy engineer at Panasonic Eco Solutions. “Most of the projects currently going on are either demo projects or learning experiences for the utilities. There is very little direct commercial stuff going on."

Moreover, Li-ion cells are highly unstable, and without proper power management, usually via a software solution, they tend to burst into flame (see the hoverboard fires). Every vendor we talked to at the EES conference dismissed safety concerns, essentially saying that the problem exists with other vendors, not them.

One company, a startup called ElectrIQ Power, provides a turnkey system in a box, including the batteries, inverter and control software, which it touts as its claim to safety superiority. Like other Li-ion system suppliers, it warrants its systems for 10 years. At that point battery capacity is 60 percent of what it was when new, requiring the purchase of a new system or an additional 5 kWh booster pack at the end of the 10-year period.

At present, however, the company has no plans to help customers dispose of the batteries at the end of their useful life. No other Li-ion vendor could answer questions about eventual disposal either.

Video: ElectrIQ Power simplifies home energy storage

Here's an interview with the founder of ElectrIQ Power:

ElectrIQ Power is a software company integrating and managing multiple energy storage technologies in a single unit. The system is based on standard lithium-ion battery technology with hybrid inverters.

But beyond safety, the environmental issues around Li-ion production are the most troubling. A 2013 EPA report concluded that batteries using lithium, nickel and cobalt have the "highest potential for environmental impacts," citing mining, global warming, environmental pollution and human-health impacts.

Take, for example, the Tesla factory near Reno, Nevada. Nevada hosts the only active lithium source in the United States, so locating the factory in that state was an obvious choice, and the Nevada government has always been open to toxic industries. Elemental lithium is flammable and very reactive; in nature, lithium occurs in compounds such as lithium carbonate, which require chemical processing to become usable. Typically found in salt flats in regions where water is scarce, lithium mining uses large amounts of water, and toxic chemicals are used for leaching, chemicals that then require waste treatment. There are widespread concerns about improper handling and spills, as with other mining operations around the world. Even in first-world countries, Li-ion battery recycling rates are in the single digits; most batteries end up in landfills.

Finally, we have nickel-cadmium (NiCd) batteries, a very old technology that has been in commercial production since the 1910s. While less expensive than Li-ion and good for more recharging cycles, they are bulkier, have lower power densities, and must be completely discharged before recharging. They also survive longer than Li-ion and have not been known to explode when overcharged.

NiCd batteries have been used in early energy-storage applications: the Golden Valley Electric Association in Alaska operates a 27-megawatt NiCd battery system, and a 3-megawatt NiCd system has helped stabilize the wind-powered grid on the island of Bonaire since 2010. But their lower density and the need to completely discharge before recharging have made them less valuable to the renewable energy industry. NiCd mining and production are just as toxic as Li-ion's, and recycling is so toxic that the batteries have been banned for most uses in the European Union.

So when it comes to adopting the most popular forms of energy storage in the world, the question the market needs to answer is: how much do we want to damage the environment for it?

There are alternatives, though. We look at those next.

To be continued...

Wind power has a cost... in human life

This is the next chapter in our series on energy production: a look at wind power, its history, applications and challenges. Wind power was first put to use in the sails of boats, and for more than two millennia wind-powered machines have been a cheap source of power for food production and moving water. Wind was widely available, was not confined to the banks of fast-flowing streams, and required no fuel. The Netherlands used wind-powered pumps to push back the sea, and wind pumps provided water for livestock and steam engines for over a century.

With the development of electric power, wind found a new application: lighting buildings remote from centrally generated power, birthing the concept of distributed power systems. The 20th century saw small wind plants for farms and residences, as well as larger, utility-scale wind generators that could be connected to electricity grids.

By 2014, over 240,000 commercial-sized wind turbines were operating around the world, producing 4 percent of the world's electricity. Today we hear news of wind turbines delivering almost all the energy needs of countries like the Netherlands and Germany... for one or two days a year.

What they don’t report as often is the failure rate of those turbines and the loss of life associated with them.

Approximately 120 wind turbines catch fire every year in the UK alone, according to a joint 2014 engineering study by Imperial College London and the University of Edinburgh. Beyond fire, there are multiple accidents that don't result in system failure but do result in the deaths of engineers servicing the systems. In England, 163 wind-turbine accidents killed 14 people in 2011. Wind produced about 15 billion kWh that year (assuming a capacity factor of 25 percent), which translates to about 1,000 deaths per trillion kWh produced (the world produces 15 trillion kWh per year from all sources). Even using the worst-case scenarios from Chernobyl and Fukushima brings nuclear up to only 90 deaths per trillion kWh produced, still the lowest of any energy source.
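The arithmetic behind that fatality figure is straightforward; this sketch reproduces it from the numbers quoted above:

```python
# Reproducing the fatality-rate arithmetic quoted above.
deaths_2011 = 14            # UK wind-turbine deaths, 2011
wind_kwh_2011 = 15e9        # ~15 billion kWh of UK wind output

deaths_per_trillion_kwh = deaths_2011 / (wind_kwh_2011 / 1e12)
print(f"{deaths_per_trillion_kwh:.0f} deaths per trillion kWh")  # ~933

# For scale: the article's worst-case nuclear figure is ~90 deaths
# per trillion kWh, roughly a tenth of the wind figure computed here.
```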

The United States appears to be the country most concerned with windgen safety, as it boasts the lowest numbers of deaths, injuries and catastrophic mechanical failures of wind turbines in the world. Even so, there are annual protests over turbine safety.

So why do countries continue to invest? Possibly because of the relatively low cost. Each industrial turbine costs $3 million and can generate up to $500,000 in energy revenue a year, so a turbine can pay for itself in 6-10 years, and it generates power more consistently than solar. However, the effective lifespan of a turbine has been shown to be less than 15 years, which flies in the face of the conventional wisdom that they last 20. The annual cost of maintenance for a modern turbine is about 1 per cent of its cost, or $30,000, and replacement parts can cost as much as $500,000 over a 10-year period, so the total cost of a typical onshore windmill over 15 years is about $4 million. That comes out to roughly $2.40 per watt over the turbine's lifetime, if absolutely nothing goes wrong.
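Here is how those figures assemble (a sketch using only the numbers above; the turbine's rated capacity is not stated in the sources, so the implied rating in the last line is our back-computed assumption):

```python
# Assembling the turbine lifetime-cost figures quoted above.
purchase_usd = 3_000_000        # installed cost of an industrial turbine
lifespan_years = 15             # effective lifespan cited above
maintenance_per_year = 30_000   # ~1% of purchase cost annually
parts_per_decade = 500_000      # replacement parts per 10 years

total = (purchase_usd
         + maintenance_per_year * lifespan_years       # $450,000
         + parts_per_decade * lifespan_years / 10)     # $750,000
print(f"Lifetime cost: ${total:,.0f}")                 # $4,200,000

# At ~$2.40 per lifetime watt, the implied rating is about 1.75 MW,
# in line with a typical onshore turbine (our assumption; the source
# does not state the turbine's capacity).
print(f"Implied rating: {total / 2.40 / 1e6:.2f} MW")
```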

Wind power will continue to be a source of energy for years to come, but only as long as we are willing to pay the premium financially and in human life.

[embed]https://youtu.be/8Hda3YGzli0[/embed]

Three grids, two not ready for alternative power

In our first two posts, we talked about how our reliance on turbine technology and carbon-fueled generation is not going away anytime soon, even though it is inefficient. Our next few posts will look at the problem of power distribution. The growth of alternative energy technology has fueled a movement toward distributed generation over grid distribution, which is what we more commonly employ. Grid power in the US is almost entirely alternating current (AC), the result of Nikola Tesla's AC winning the historic debate with Thomas Edison's direct current (DC). We are not going to get into the benefits of one over the other here but will, instead, talk about what is. We have a massive investment across most of the US in that centralized AC infrastructure. The exceptions are most of the state of Texas, which operates its own independent grid, as well as small, separately run systems in southern Alaska and some of the northeastern states.

The US has, essentially, three separate energy grids: East, West and Texas.

All grid systems lose a certain amount of electricity in transmission, and AC tends to lose more power than high-voltage DC over very long distances. The bigger problem with alternative power on most of the US grid is that it cannot be sent to where it is needed when it is needed; it must be used locally or "boosted" along the line. With large grid-scale facilities, like many of the photovoltaic plants being built in the Southwestern deserts, that becomes a significant issue.

Distributed generation is a concept becoming more popular because it fits better with alternative power generation. Quite simply, it means generating the power adjacent to where it is needed. If you put solar panels or wind generators on your property, you are a distributed-generation facility. That causes significant problems for the grid.

First, your solar panels produce DC power, similar to what your handheld devices run on internally, but your home and the grid are set up for AC. You have to add an inverter to change the DC into AC so the power can be used in your home and placed on the grid. Second, as we mentioned previously, the power you produce doesn't necessarily arrive when you or the grid actually need it, so the utilities cannot rely on your power being available when it is needed. A more flexible, independently managed grid can better absorb that excess energy, as Texas's does, but most of us are not on such a grid, so we have to make do with what we have. We have invested far too much in the existing infrastructure to rip it out and install a new one.

That brings us to the owners and stewards of the grid: the utility companies. Power generation has become a major headache for utilities, drawing resources and money away from grid maintenance (in other words, the power lines and towers criss-crossing the country). There has been little investment in upgrading the grid because of it, yet the grid remains a significant source of income for utilities, especially as the distributed power network grows. When it looked like they could make money off people who generated power with alternative energy, by buying it cheaply and selling it at a profit, they were more than willing to offer sweetheart deals to companies like SunPower; but as we have pointed out earlier, the profits have not been forthcoming, and the deals, known as net metering, are going away.

At the same time, some utilities are looking at getting out of the generation game altogether. PG&E in Northern California has made no secret that it is not only not building new generation facilities, it is selling off what it does have to independent companies, with an eye to upgrading distribution infrastructure. The utility actually buys much of the power it delivers to customers from out-of-state power plants. While this practice makes financial sense for the utilities, as well as for the integrity of the grid, it means higher power bills for customers, including those who have already invested in alternative sources. The big losers in this paradigm shift will be the solar and wind companies that have relied on a steady stream of investment and revenue from the utilities.

Turbines are foundational to electrical power generation

By Lou Covey, Editorial Director

This is the second part of our series on the weaknesses of alternative power. In this installment we look at the core of our generation technology: the turbine.

To evaluate turbine technology, we first need to understand how important it is to the production of electricity. What is going on in California is instructive.

California Gov. Jerry Brown signs bill to combat climate change by increasing the state's renewable electricity use to 50 percent and doubling energy efficiency in existing buildings by 2030 at a ceremony Wednesday, Oct. 7, 2015. (AP Photo/Damian Dovarganes)

Governor Jerry Brown recently signed a law requiring that 50 percent of the state's power production come from renewable sources by 2030, primarily through solar technology. The state has been lauded for its advances in this effort, and by 2014 renewables accounted for 20 percent of total production. However, that does not mean that 20 percent of the energy the state consumes is from renewables. In reality, California produces only 67 percent of the energy it consumes, down from 90 percent in 1990.

California is increasingly dependent on power generated in other states, like Utah and Idaho, where the bulk of the energy comes from facilities that burn natural gas and coal to produce the steam that drives their turbines. As a result, the power consumed in California is actually dirtier than it was 20 years ago. How this happens is an interesting shell game.

Let's say you have a coal-fired plant producing 1 GW of power 24 hours a day. This power can be used anywhere, because the technology is easily distributed through the entire network, but it is "dirty" power because it produces carbon dioxide. You want to clean up the environment, so you build a 500 MW solar farm to produce clean, renewable energy. But that farm produces power, at best, six hours a day, and its output can be distributed only within a very small area of the state. And then you shut down the coal-fired plant.

That gives you a net loss of 500 MW, so you contract with a Utah utility to buy its excess power to make up the difference. In fact, you have to buy more than that 500 MW, because you also have to provide power during peak usage, which begins after 4 p.m., when the solar panels go off line. So the carbon dioxide produced by the Utah facility is equivalent to what was produced by your shuttered plant in California.

Here's the good news, though: by shuttering the coal-fired plant you have shifted your generation mix by 1.5 GW (1 GW of fossil capacity removed plus 500 MW of renewables added), even though you are producing 500 MW less. You now issue a press release saying you have dramatically increased the percentage of renewable power produced in the state... even though you've really done nothing for the environment. In fact, you may have made it worse.
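The bookkeeping behind that shell game is worth making explicit (a sketch using the round numbers above; the six-hour solar day is the best case cited earlier):

```python
# The renewable-percentage shell game, in the round numbers used above.
coal_retired_mw = 1_000     # coal plant shut down, runs 24 h/day
solar_added_mw = 500        # solar farm added, ~6 h/day at best

coal_mwh_per_day = coal_retired_mw * 24     # 24,000 MWh/day removed
solar_mwh_per_day = solar_added_mw * 6      #  3,000 MWh/day added

shortfall = coal_mwh_per_day - solar_mwh_per_day
print(f"Daily shortfall to import: {shortfall:,} MWh")   # 21,000 MWh

# The in-state renewable *percentage* rises (fossil output fell while
# renewable output rose), but the imported replacement power is still
# generated by burning coal or gas in another state.
```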

Multistage steam turbine blades

Steam turbines have been generating electric power since the early 1900s. In 1903, Commonwealth Edison opened the Fisk Generating Station in Chicago, using 32 Babcock and Wilcox boilers to drive several GE Curtis turbines of 5,000 and 9,000 kilowatts each, the largest turbine-generators in the world at that time. Almost all electric power generation, from the time of the Fisk Station to the present, has been based on steam-driven turbine-generators. The Fisk turbine was single-stage, with one set of blades, and could achieve a maximum theoretical efficiency of 33 percent, though it achieved much lower numbers in practice.

Efficiency is determined by the amount of power converted from the steam into usable electricity. Turbine efficiency depends on the number of blades, their design, the amount of turbulence behind each set of blades, friction and steam temperature. Over the years, turbine efficiency has improved to as much as 40 percent as additional stages were added. Even today, though, there are limitations based on the quality of the water (seawater, alkaline water, etc.) and the quality of the blades.

However, turbines don't produce electricity as soon as you flip a switch. The huge and expensive machines must be gradually spun up (using electrical or mechanical power) before steam can be gradually applied to heat and expand the blades to operational levels. This process can take several hours, so utilities have to predict demand half a day before spinning up the turbines, ensuring that when the solar arrays go off line at 4-5 p.m. the turbines are generating enough power to make up for the loss. Utilities may be producing as much as 110 percent of needed power from just after noon until demand drops after 8 p.m. That means a significant amount of the power produced during peak solar production is actually wasted energy.

Whether turbines are driven by steam, hydro or gas, their blades need regular replacement and repair, depending on the quality of the working fluid or gas. Heat, contaminants and turbulence can weaken and warp the blades in a relatively short time, requiring that the power plant be taken down in part or altogether. There are constant design advances to lessen downtime and increase output, but turbine technology remains inefficient and costly. As a result, many utilities, like Pacific Gas and Electric, are divesting themselves of power generation to concentrate on distribution alone. As more third-party companies take over generation, our ability to maintain a steady flow of power is endangered: if one company fails economically, will we be able to make up for the loss of its production?

Is alternative energy really an alternative?

Alternative energy is a huge industry, generating and spending money at breathtaking speed. Governments have invested trillions of dollars in building out the industry's infrastructure, and thousands of private citizens have invested billions in private applications. And yet two-thirds of our electrical energy is generated by fossil-fuel-burning technology, just as it was 20 years ago. Despite all the infrastructure established in the past two decades, the world's demand for energy has outstripped our ability to meet it with alternative power. It's time we admitted that alternative power is not an alternative; it is only a supplement. Once we admit that fact, we may be able to get to work actually finding a real alternative.

This series had its genesis more than 40 years ago with my first foray into investigative journalism: a seven-part series on alternative energy as it was in the 1970s. The conclusion of that original work was that alternative energy lacked the ability to meet the world's needs, much less what it wanted. After 40 years of following the industry, I have found it is not much different now. The technology, while more efficient and cheaper, is still not sufficient to meet demand and probably never will be on our current course. Multiple reports predict that our energy usage will triple in the next 15 years. If "alternative" energy can't keep up with the need today, how bad will things be in 15 years and beyond?

In this new series we will look at various sources of alternative power: the good, the bad, and the future of each technology. It will conclude with a look at what might be possible today if only we looked outside the box we have created.

Today we set the stage for where we are.

There have been two contradictory articles in the Washington Post recently. The first stated that the costs of wind and solar have come down dramatically and are close to par with coal and oil generation in cost per megawatt. The article suggests that those dropping prices should make it easier to meet the demand for energy using alternative sources. What it doesn't state is that the low cost is largely achieved through government subsidies, most of which are going away soon, not just in the US but everywhere in the world. Take away the subsidies and the price will skyrocket.

The second article, however, paints a very different picture. In 1990, two-thirds of all our power production came from coal, oil and natural-gas generation plants. That was the beginning of the modern alternative energy industry, as subsidies started growing. What also continued to grow was the world's demand for electricity, much of it driven by the computing industry with always-on computers and data centers, the latter consuming 10 percent of all electricity generated. That demand has required additional generation from carbon-fuel technology, to the point that, after trillions of dollars of investment in alternative sources, coal, oil and gas still account for two-thirds of all generation.

Much has been made of Europe's advances in alternative power. The Netherlands recently announced that its ocean-based wind farms delivered more than 100 percent of the country's power needs on one day this year, and some countries claim that alternative power often provides 50 percent of their daily needs. What is not discussed is how inconsistent those alternative sources are.

Wind produces power only when the wind blows within a specific, narrow range of speeds. If the wind speed is too low, the turbines don't turn; too high, and the turbine has to be stopped to keep the blades from warping under the torque placed on them. Solar produces power for six hours a day at best, during the summer. That works out well for Spain, which gets a lot of sun in spring, summer and fall. It does not for Sweden, which gets virtually no sun for several months of the year. Bottom line: sun and wind are just not always available.

As a result, Europe is quietly buying coal from the United States so it can gear its older power plants back up to provide electricity on a consistent basis. This is good for the US, whose coal reserves rival Saudi Arabia's oil reserves. The coal industry has seen US demand drop as harsher regulations keep electricity production from coal severely limited, but at the same time natural gas is enjoying a rapid increase in demand.

California has been crowing about the rapid increase of its alternative energy production and is predicting that 50 percent of all power produced in the state will come from alternative sources by 2030. That is entirely likely as the state closes nuclear and carbon-fuel plants, but California currently imports 30 percent of its power from states producing energy surpluses with coal-burning plants. Part of the problem is that, even in perfect conditions, some of the most touted technologies are not producing as expected.

For example, a massive facility in Ivanpah, California, uses acres of reflective panels to focus solar radiation on a single column at the center of the facility. The heat turns water to steam, which in turn drives traditional turbines to produce electricity. The problem is that the facility is not producing power solely from solar. Operators have had to bring in natural gas to supplement the heat source and get the facility up to its promised capacity. Part of the problem lies in the turbines, which are rated at 33.3 percent theoretical efficiency but in reality operate at around 25 percent.

That brings in the issue of the utilities, which have the responsibility of meeting the population's power demands. The alternative energy industry promised the utilities a source of home-produced electricity using customers' roofs for solar and wind power. Over the past 10 years that source has proved wildly unpredictable, requiring utilities to keep current carbon-fuel plants spun up to 110 percent, just in case solar and wind production drops off, which happens more often than not. There have been multiple lawsuits across the country as utilities and private homeowners find that the promises of the alternative energy companies cannot be met with current technologies. There is also a US Department of the Treasury investigation underway into several large alternative energy companies for over-valuing technology for tax purposes.

When all the facts are in view, the alternative energy industry, in fact the entire energy sector, is in serious disarray. There is hope, but only if we have a realistic view of what is actually happening.

At the core of the difficulty is the centerpiece of nearly all energy conversion: the turbine. Turbines are used in traditional energy plants run on coal, oil and natural gas, but they are also used in hydroelectric, geothermal, solar-concentration (like Ivanpah), tidal, wind and waste-heat conversion. Without a thorough rethinking of turbine design, we will be hard pressed to find a true alternative.

This series will look at all forms of energy production, from fossil-fuel to experimental concepts and everything in between. We will begin, next, with a look at the problem of turbines.

Sponsored by 3DP-international

Hacking by sound not that simple

A recent story from Reuters kicked up a stir by claiming that hackers could use sound to steal data from a computer or network. While the story was true, it turns out it isn't the whole story, and the threat is smaller than the report implied, according to cybersecurity expert Jack Wolosewicz. Here's the interview.


Two team up to solve heat problem in PV panels

Two of the oldest solar technology manufacturers, SolarWorld and FAFCO, have teamed up to deal with a significant weakness in solar PV panels: heat degradation.

The problems excessive heat causes for the panels are significant. On a hot day, a panel can lose 10-25 percent of its rated output, and over time consistent heat above 90 degrees Fahrenheit can cause permanent degradation of up to 30 percent within a couple of years. That information isn't widely disseminated, and few people, even those who sell the technology, know the problem exists. That's why, when we were talking to PV vendors at the Intersolar Conference in San Francisco, the only people who would acknowledge the problem were at the FAFCO/SolarWorld booth.

FAFCO has been providing passive solar water-heating products since the 1970s, as has SolarWorld with PV panels. The companies have created several cogeneration systems over the years, but this is the first that marries the two technologies.

FAFCO has invented a heat-exchange system known as CoolPV™ that attaches to the back of a typical 3x5 panel. Looking and acting much like a car radiator, the system runs cold water across the back of the panel, drawing heat off it. Currently the most common use for the heated water is warming spas and swimming pools, according to FAFCO president Bob Leckinger.

Leckinger preferred not to give the cost of the system, but claimed it could be recouped within three years. How you might buy the technology is another question.

Leckinger said they are selling product now, but a search of either the SolarWorld or FAFCO website finds no information on the new systems. And since the companies are focused on recreational uses, the technology is not likely to be available to the general public any time soon. That is unfortunate, because there are a lot of solar panel farms literally burning up in the American Southwest.

See full interview here:

Energy Storage takes center stage at Intersolar

By Lou Covey, editorial director

The electrical energy storage industry continued to grow in credibility this week at the Intersolar 2015 conference with a co-located show in Moscone West. However, as a possible indicator that it is still a very small market, the Intersolar folks put the show name all in lowercase (ees).

The sector's installed base is set to grow 250 percent, year over year, by the end of 2015, according to GTM Research, but other reports indicate that represents a total investment of just $2.6 billion worldwide. As a comparison, solar energy installations represented an investment of $172 billion as of the end of last year. The industry has nowhere to go but up.

Showing a 10MW system at ees

There is no obvious leader rising in the ranks, except by general impression. The industry had done very little to distinguish itself until Elon Musk announced in May that Tesla would be offering home and industrial storage products "real soon," which was enough for lots of wealthy people with electric cars and solar panels to put down a big chunk of cash to get their systems… sometime next year (a fool and his money…).

The reality is that the industry has been around for some time, selling products around the world relatively profitably, without a clear leader in the market. One would think the attention being paid to the Tesla announcement might cause some jealousy, but that was not the case at ees. Every company offering a storage system (and there were many) was practically salivating over its prospects.

“We are selling proven products with higher capacities and lower cost now than what Tesla says they are going to sell,” said Stefanie Kohl, marketing director of Sonnen-Batterie. “We made a decision to enter the US market early last year, and when Tesla made their announcement it was a nice gift to our marketing budget. Now everyone knows what it is and we can provide a better product for a better price." Being first to market is not always best.

Most companies offering storage products at ees called themselves a "market leader" for one reason or another; Sonnen-Batterie calls itself "the German market leader." It has sold close to 4,000 units of its intelligent energy-storage system to homeowners, farmers and businesses since entering the German market in 2011. Germany has approximately 1.5 million solar installations, with more coming every day, so Sonnen-Batterie has a way to go before it reaches market saturation, but it seems a good start.

The investment community thinks so, too. Last December, Dutch and German investors put up $10 million to fund expansion.

The issue to be resolved is still cost per watt. Storage systems make sense for commercial and residential applications when there is money to be spent. With solar installations producing power at $0.33 per watt, they are a pretty good deal compared with peak power from utilities, which runs around $0.85 per watt between noon and 6 p.m. But adding a storage system can make it a wash, or even end up costing more.

So like all alternative energy technology, storage technology is still the realm of the wealthy. But it is a good start in the right direction.

Electric vehicles and hybrids: the science beyond the hype

Electric vehicles are all the rage today with politicians and pundits predicting mass adoption within the decade as a significant means to combat climate change. The reality, however, is not often reported and in a controversial presentation at the 52nd Design Automation Conference in San Francisco, Synopsys scientist Peter Groenevelt walked through the bare facts.

His bottom line was that electric and hybrid vehicles have a place in society but might not be ready for worldwide adoption. In fact, if you don't live in a Mediterranean climate or on flat ground, a conventional combustion engine may be your best choice.

We interviewed Mr. Groenevelt for a quick overview of his talk. If you’d like to receive a copy of his entire slide presentation, send a request to this link.  Here's the interview:

Secure collaboration is a quiet trend at #52DAC

By Lou Covey, Editorial Director

While outsourcing software and design development is a common practice, the idea of putting your company’s crown jewels into the cloud for a freelancer to monkey with tends to drive sales of anti-emetics. Can you safely allow virtual strangers to access your server, or should you just suck it up and overwork your employees?

That has been a continuing conundrum for the Electronic Design Automation (EDA) industry and its customers in the embedded software and semiconductor industries. Larger companies, like Synopsys and Intel, either use internal security paradigms in their collaborative tools or work with some of the big players, like IBM and OpenText. The costs of those tools, however, don't always fit the budgets of smaller companies and can be a hindrance to outsourcing firms.

What makes the whole issue more difficult is that while companies readily admit it is an important issue, not many are actually willing to talk about what they are doing about it.

At the Design Automation Conference in San Francisco this week, there was a noticeable presence of companies stating that they actually do provide for secure collaboration, and that they were more than willing to tell you who they provide it for. One of the main players, OpenText, proudly proclaims its list of customers, including, in the electronics world, Alcatel-Lucent, Cirrus Logic and Renesas (see interview here).

Other players, like the recently funded Zentera, not so much. We visited Zentera’s booth at the Design Automation Conference and they were quite adamant about not saying anything substantial on the record, but their website touts a lot of partners, including Microsoft and Qualcomm.

Then you get into the realm of the EDA tool providers, and the walls go up quickly. Mentor Graphics expressed surprise that one of its major customers, Qualcomm, was working with Zentera to provide secure collaboration. Synopsys and Cadence each claim their own "cloud" solution, consisting of private servers stuffed into their headquarters buildings.

Dassault Systèmes, on the other hand, was quite effusive about its Enovia collaborative platform, which focuses security according to roles, geography and hierarchy. Dassault is relatively new to the world of semiconductor design and is making a strong effort to differentiate itself from the "holy trinity" of Synopsys, Mentor Graphics and Cadence. It has been miles ahead of the EDA industry on collaboration and security, simply because of its much broader range of customers, including mil-aerospace niches that require a standardized approach.

For third-party providers of design services, these secure collaboration platforms can open doors to working with the most cutting-edge technologies at companies that are often strapped for resources. Customers that want to integrate design environments from multiple sources can use them to fold external design teams into an all-encompassing environment without giving up those aforementioned crown jewels. If the customer doesn't want the additional expense, it might be worth it for outsourcers to adopt the collaboration platforms themselves and work the cost into their services overall.

Outsourcing is no longer a zero-sum game and has benefits for many

This is the latest in our ongoing series of articles on outsourcing, its benefits and downfalls. By Lou Covey, Editorial Director

Outsourcing product design and manufacturing has become an international way of life despite the concern that it takes jobs away from one country in favor of another. As the practice has matured, it has become less of a zero-sum game, as long as the participants realize it is best treated as a cooperative exercise.

The decision to outsource any part of a product lifecycle is no longer a matter of which country a company will choose, but which countries. High-precision work is still the realm of the United States, with Western Europe a close second. Asia remains an acceptable choice for mass production of mid-quality products, even though costs are starting to rise. And Central Europe is rising as the choice for high-quality, low-cost software design.

In the end, companies have a much greater choice in how and where they choose to put together their products and services and it tends to result in jobs all around the world.

We spent some time talking to George Slawek, the managing partner of the software outsourcing company Eurocal Group, which features management, customer relations and sales in the United States combined with software developers in Poland. We found he sees the business as not either/or. He says Poland offers options not available elsewhere, but they are not the be-all and end-all of options. You can listen to the 10-minute discussion here.

http://www.spreaker.com/user/footwashermedia.com/outsourcing-has-benefits-for-all

(Full-disclosure: Footwasher Media provides consultation to Eurocal Group on content and marketing strategy)

Data breach fatigue and NIH fuel ineffective cyber security


This is another part of our ongoing series on outsourcing services, again focusing on security. Large companies rely on the work of outsourcing providers for developing security solutions and containing breaches. By Lou Covey, New Tech Press

News reports about data breaches are almost a daily occurrence. Companies spend millions on identity-protection services for affected customers while the same types of breaches continue with no end in sight. The sheer volume of data stolen is astronomical, raising the question: why isn't anything being done?

“There is no financial reason for companies and governments to do anything about the problem because we have not seen any significant economic damage done to the companies or their customers,” said Anne Saunders of Eurocal Group, a U.S. software development company.

She has a point. Relatively few people have actually experienced personal financial loss. For example, just last week Target announced a $10 million settlement for the breach that compromised the data of more than 100 million people: 10 cents for each victim, not counting legal fees.

While the amount of data stolen is massive and grows with each attack, the money spent on identity-theft protection for and by those customers after any given attack is extremely low. The entire ID-theft industry is currently only $3 billion annually, with a projected growth rate of 0.5 percent and no measurable profit. The number of companies in the market makes each slice of that pie very thin, so it's not a business for the faint-hearted. The good news for that industry is that corporations are adding budget for those purchases because, well, it's relatively cheap.

Eurocal Group is one of many companies providing outsourcing services for companies around the world and they are finding a growing demand for the services with deep experience in cybersecurity. “If security isn’t a significant part of your development, whether it is embedded systems or web design, you’re just asking for trouble. Lucky for us, lots of companies have not been thinking about security,” Saunders said.

A battle-weary market

Then there is the problem of breach fatigue. The number of people affected by breaches is impossible to measure because of the interconnectedness of the data. One person might be affected by the Target, Anthem and Michaels breaches, and another might not be affected by any. A recent report from Experian stated that 62 percent of consumers received at least two breach notifications in the past year.

Correlating with the Experian survey, market-research firm Ipsos reported in December that 62 percent of US consumers are now concerned about the security of their data, up from 53 percent the previous year. However, 85 percent reported that they knew of no one whose data had been compromised, and only 6 percent reported being the victim of a breach. So while there is growing concern, there is hardly a demand from the market to actually do something about it.

Which may be why some leading figures in the industry tell consumers they are pretty much on their own. Herjavec Group founder and CEO Robert Herjavec discussed the recent, massive Anthem breach in an interview with Fortune magazine. He stated that the integrated nature of health-care systems requires consumers to take responsibility for security: "They must diligently check credit card records, and monitor their personal records with insurance and medical providers to mitigate the risks of credit card fraud and identity theft in the fallout of this breach."

Don’t just do something, stand there

The U.S. government is also concerned about cybersecurity and is convening panels and study groups from the federal level all the way down to the municipal level. They have produced reams of legislation designed to deal with the issue, but there are two problems: 1. the legislation is designed more for show than to actually deal with the real problems; 2. it is designed to improve and control government surveillance rather than the security of voter data.

Better progress is being made in the European Union, especially in smaller countries in Central Europe, according to Jack Wolosewicz, CTO of cybersecurity tech startup Certus Technology Systems. He said Europeans seem more open to security innovation than the US government and large corporations. "They tend to outsource to known companies, like RSA and Verisign, not because those are the best solutions but because, if there is a breach, they can say they went with the best-known solutions. So no new ideas are considered."

Wolosewicz said the "CYA mindset" is the biggest barrier to adoption of effective security in large companies and enterprises which means smaller enterprises are more likely to be willing to look outside of the box.

"Financial services and internal corporate security are taken more seriously, with big bucks being spent on second-factor authentication like RSA tokens," he stated. "Expensive and outdated as they are, there is a market for them because relying on passwords alone is not a security strategy anyone trusts any more. For mass markets, single sign-on is everywhere and browsers remember your passwords because it's easy for users, but it's still passwords, and that only increases risk."

Wolosewicz pointed out that Microsoft and Yahoo have launched initiatives to move away from passwords, so there is some movement in the right direction. "Mass markets are happy to pay for a better user experience to attract new users, but till now, better security meant worse user experience."

In the end, the major players that control what happens to the consumer data are not financially incentivized to change how things are done. Since their customers have pretty much accepted the status quo, any substantial change will have to come from non-traditional sources.

“We’ll take that business,” Saunders concluded.

Sony hacks may force companies to eliminate passwords

This article is the first in a year-long series looking at outsourcing services and how they are no longer just a means of saving money. Today we look into the arena of cybersecurity and a startup using contract software design to create a new security paradigm.

By Lou Covey, Editorial Director

The hack and subsequent terror threat against Sony Pictures laid bare the inherent weakness of cybersecurity worldwide. Even the most powerful firewall technology is vulnerable to the person with the right user name and password (credentials). In Sony's case, the administrator credentials were stolen through an unsophisticated phishing attack, allowing the hackers to bypass the firewalls and storm the corporate castle. This is the most common way hackers take down a system.

We have all heard stories of new technologies that overcome this basic flaw, from biometrics to two-step verification, none of which seems to be taking significant hold in the cyber world. According to Jack Wolosewicz, CTO and co-founder of Eurocal Group, corporations are reluctant to move beyond the familiar. Articles in the Harvard Business Review and Fast Company lean toward agreeing with him: companies are dedicated to giving customers what they are willing to accept, not necessarily what they need, and they won't force new paradigms on them. But Wolosewicz says there is no such thing as a strong password.

“All passwords are weak because they are easily stolen and their complexity is irrelevant once a hacker has a copy of the password,” he explained. “This enables the hacker to masquerade as an administrator and, snap, the passwords, personal data and credit card numbers of millions of users are now in the criminal domain.”

However, Wolosewicz said, in the area of cybersecurity, that reluctance may give way to necessity. “We may be at the pain point where all of us are willing to look at something significantly different.”

Wolosewicz has a deep background in computer security, and after working as CTO with the team at Eurocal Group, he realized he had the engineering resources to create a security system that eliminates the password paradigm, and to do it without the startup costs and headaches. Certus was born. Wolosewicz serves as CTO of Certus as well, managing the Eurocal engineering resources for both companies.

The Certus cryptographic protocol is based on a "one-time pad" cipher, proven unbreakable in 1945. The system creates a sonic digital handshake between a mobile phone and any device wishing to authenticate the user. If the phone is stolen or lost, the user just deactivates it. High-security applications may be reinforced with second-factor authentication, so a lost cell phone in the wrong hands does not pose a threat.

“The Certus authentication system eliminates user credentials that can be separated from the user and misused in an attack,” Wolosewicz claimed. “It is significantly easier to use than two-factor verification and more reliable than biometrics. The cell phone has become an appendage for most of us and now it can become a universal key to the Internet. It’s keyless entry for the Web”. In payment systems applications, Certus never stores user credit card information, so even if a corporate system is somehow compromised, no credit card numbers or passwords can be stolen.
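For readers unfamiliar with the cipher Wolosewicz mentions, here is a minimal sketch of the textbook one-time-pad principle in Python; it illustrates only the underlying cipher, not Certus's proprietary sonic-handshake protocol:

```python
# Minimal sketch of the textbook one-time pad -- the cipher principle
# Certus builds on, NOT Certus's actual protocol.
import secrets

def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
    """XOR the message with a random pad of equal length.

    The pad must be truly random, as long as the message, and never
    reused -- those conditions are what make the cipher
    information-theoretically unbreakable."""
    pad = secrets.token_bytes(len(message))
    ciphertext = bytes(m ^ p for m, p in zip(message, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    # XORing with the same pad recovers the plaintext.
    return bytes(c ^ p for c, p in zip(ciphertext, pad))

ct, pad = otp_encrypt(b"authentication challenge")
assert otp_decrypt(ct, pad) == b"authentication challenge"
```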

For the past few years, and going even further at this year's CES, consumer electronic devices from mobile phones to automobiles have been filled with easily hacked technology, even if it isn't currently activated. There are already reports of smart TVs being used to harvest data on customers, without their knowledge, while they watch their favorite programs. The rapidly growing popularity of streaming entertainment means a growing number of online accounts protected by the same user names and passwords used for personal computing devices, all of which makes individuals vulnerable to national cyber attacks. For example, let's say Sony does decide to release The Interview on streaming media. It would be relatively easy right now for those same Korean hackers to collect the names and personal information of anyone who watches it.

We may have reached a pain point in electronic-device security that goes so far beyond bandwidth, speed, latency, capacity and power usage that it makes all those issues irrelevant to the current problem of security.

See part one of the interview.

This article sponsored by Blaylock Engineering, EuroCal Group, MeBox Media and Busivid.

Outsourcing 101: A new series

Outsourcing, both onshore and offshore, is here to stay and is very big business (link numbers), but it is not static. Suppliers in Asia and on the Indian subcontinent are still the leading players, but significant resources are growing in the US and the European Union that are challenging that traditional hegemony. New Tech Press has been looking at this trend for the past few months and will begin publishing a series of articles and interviews this month, running deep into 2015. What has become clear is that outsourcing falls into distinct groups: multinational enterprises providing soup-to-nuts services for large customers, foreign national organizations targeting US and European corporations, and "blended" suppliers that feature local management with foreign-based resources. The latter two often provide unique specialization in design and industrial niches, like security, automotive and web design.

There are also distinct divisions in the cost of these resources, ranging from expensive but necessary, when customers lack internal resources but need high-quality support, to very inexpensive, when customers can fudge on quality, expertise and schedule. The blended companies seem to span and straddle the differences.

Somewhat surprisingly, offshore resources located in the EU's easternmost countries, Poland in particular, are demonstrating growth outstripping India's and could soon reach parity. Those countries, because of a closer relationship, culture and respect for intellectual property, are becoming a favored source of higher-end services once considered the exclusive domain of India.

Another clear trend is the returning importance of precision machining in the United States. US-based firms are finding that rising personnel and shipping costs are negating the benefits of offshoring. That fact, combined with offshoring's often lower quality, environmental factors and the rising use of high-quality 3D printing, is making US-based manufacturing highly desirable and profitable once again.

These are the aspects and trends New Tech Press will be looking at in the coming months.

The coverage is being underwritten by Eurocal Group, MeBox Media, Blaylock Engineering

Coin Guard: Home security for the tech impaired

Home security gadgets are all the rage, fueled in no small part by Dropcam and its competitors. But video surveillance and smart locks come with data-storage and hackability issues that scare most non-techie types. Pilot Labs, a small OEM wireless product manufacturer in San Diego, decided to leap into the fray with a product that brings wireless security to the masses who have a hard time figuring out fax machines.

Coin Guard, currently awaiting funding through Kickstarter before it becomes widely available (hopefully in time for Christmas), is a disk about two inches across that the user lays on top of something to protect. If the disk is moved, it sends an alarm to a mobile phone. So if you can download an app to your phone, plug an ethernet cable into a router, and press a button, you can have a security system installed in minutes.

We sat down with company co-founder Chris Thomas to get the skinny on the unusual product.  Check it out.

#ARMTechCon Offers up some nuggets of innovation

By Lou Covey, Editorial Director

Trade shows and conferences are becoming exercises in repetition, and the loudest participants are generally the ones saying the same things as everyone else. Once in a while, though, if you look hard, you can find a couple of products and services you might have missed in the noise. New Tech Press found three of them at ARM TechCon 2014 in Santa Clara, CA, this week.

The three companies were Undo, Toradex and Somnium, and we stopped to talk to them about their offerings. Undo, based in Cambridge in the UK, has found a novel way of speeding up the debug process. Toradex, out of Seattle, Washington, makes a set of single-board computing modules that are very big in the university world. Finally, Somnium, based in Wales, is beta testing a software development environment for ARM® Cortex® embedded systems. Here's the video:

[embed]http://youtu.be/8wspyhIGCt4[/embed]

Me!Box upgrades video platform with 'scary' good features

New Tech Press has been using the video platform from Mebox Media with great success for more than two years now, so we were both concerned and intrigued when they let us know they were making a significant change this year. Would the upgrade be as easy to use? How will it affect the engagement level? And what about that scary new feature that lets you know who is watching? Yeah, that's part of it.

We sat down with CEO Mark Jacobs and COO Ken Fitchler to ask these questions and got some fascinating answers. Here's the interview: