We’ve added podcasting to our content program. It’s called Crucial Tech and we look at the technology that affects us all but we don’t always understand. We’ve just finished a series on Artificial Intelligence and how it relates to digital security. You can find it on Apple and Google podcasts, Spotify, Breaker, Anchor, Overcast, Pocketcasts and RadioPublic.
We haven't posted for almost a year because we've been working on a book on energy policy titled the same as this post. You can see all of the chapters on Medium. It is still in progress but coming close to the end. So catch up here.
by Lou Covey, Editorial Director
Every week or so I get a news story in my feed about how some country, or state or city is making incredible strides toward converting to 100 percent renewable energy use. Deep in my heart I want to believe that to be true, but after observing the renewable energy industry for more than 40 years I know in my head that it just ain't so.
Then today, from Green Tech News, one of the major rah-rah sites for renewables, came a very sobering bit of reality. We are not making the progress we think we are. The article gives many valid reasons for this but continues to ignore one of the deepest: we are really not making any progress on innovation in green energy technology. We need to start admitting we are going nowhere and need to change how we think about energy conversion.
(Note: all energy sources are merely conversions of one form of energy to electricity, be they mechanical (turbines) or physical reactions (solar). There is nothing else.)
Solar, wind, hydro, geothermal and tidal sources are wonderful supplements to our energy production, but they are still based on technologies developed around a century ago, with only incremental advances since. Despite trillions (yes, trillions) of dollars in private and government funding, these technologies remain the most inefficient forms of energy conversion we have.
By comparison, internal combustion engines have evolved by leaps and bounds over renewables, which is why they remain the most popular and cheapest form of energy conversion, and they are only 33 percent efficient at best (in other words, they only extract a third of the potential energy from fossil fuel). The automotive industry has done more for reducing climate change than any other, even as they hold the greater blame for it. You might argue with the first part, but from a strictly scientific/engineering view, it is inarguable.
And the catastrophic damage to the planet, as I wrote about last year, may be greater than the damage done to the atmosphere. We haven't seen a fraction of what that potential will be, but it is coming soon.
Should we give up? No, of course not. But we need to change our thinking and realize we have not yet found the answer. We need to stop bowing down to the false prophets of climate change (Elon Musk and Al Gore are two of them) and realize that our public funding of inefficient conversion is not helping. We need our government to start funding wild-eyed "maniacs" who are trying to do something different.
One place to start is in the area of iron-flow aqueous battery technology. It is more reliable, lasts longer, is completely recyclable and has zero environmental impact compared to the more popular and incredibly toxic lithium-ion technology (that powers mobile devices, electric and hybrid cars, and cameras, and will soon be rolled out in solar powered homes and businesses across the country).
There is technology that can transform power conversion in our society, but it isn't like anything we have seen before. We need to stop throwing money at pipe dreams and start thinking differently.
Researchers at security firm Check Point last week discovered a second, widespread malware campaign on Google’s app store, dubbed “Judy”. The auto-clicking adware was found on 41 apps developed by a Korean company and uses infected devices to generate large numbers of fraudulent clicks on advertisements, generating revenue for the perpetrators behind it. By the time it was discovered, it had been downloaded on up to 18.5 million mobile devices. The malicious code may have been on the devices for years.
Which makes a good introduction to the second video in our sponsored series on Designing for Security: What's in your App?
Video: Designing for security: What's in your App?
The first systems targeted by the ransomware attack belonged to the National Health Service in the United Kingdom and were followed by systems in 150 countries worldwide. More attacks are expected this week until patches can be applied to the current systems. The security hole was identified as a weakness in Microsoft XP-based systems, originally discovered by the U.S. National Security Agency and revealed in a data dump stolen from the NSA last year. Here’s where it gets interesting.
Microsoft stopped supporting XP code in 2014, but legacy systems worldwide had not updated to newer operating systems. When the NSA’s “backdoor” was revealed, Microsoft began working on a patch that was distributed to registered users a month ago.
We have reported over the past two years that the biggest problem in cybersecurity is that people just don’t pay attention to their security, even when they are given the tools to make their data secure. Eventually apathy catches up with you.
That’s why we have partnered with Eurocal Group, a software development organization headquartered in San Francisco, to provide free data security evaluations for companies big and small. We’ve created a series of short informational videos covering what you may be missing if you are generally clueless and apathetic about what it takes to be secure, with specific focus on legacy systems, big data, mobile app development, and cloud-based systems. The introductory video is presented here, today.
Video: Designing for security: What are you missing?
Sponsored by Eurocal Group
The conclusion of a year-long New Tech Press investigation of the employment industry shows it is failing job seekers, employers and the investment community. Some of these services can point to a few areas of success, and some are better than others, but none are particularly effective. In fact, if a job seeker were to place one roulette bet or buy one lottery ticket for every job they apply for on a job listing service, they would be more likely to win a living wage from gambling than from job searches. Employers and the employment industry rely on the Standard Occupational Classification (SOC) and North American Industry Classification System (NAICS) codes to form the foundation of job descriptions. In our interviews with HR professionals and system developers, we discovered that when they create job descriptions they consistently overuse and misuse the coding systems to pull in the most candidates, even unqualified ones.
Second, the “artificial intelligence” built into the systems is not much more than a scan of those codes and word search functions, resulting in bad returns.
In making this discovery, we input the following job functions into a dozen job-finding sites (e.g. Monster.com) and employment sites (e.g. Facebook):
- Marketing communications
- Marketing manager
- Audio design
- Video editing
- Program management
The results produced no more than two positions that were actually for those jobs, alongside hundreds of positions ranging from electronic system design to sous chef. The most egregious example is a standard search for audio designer jobs. This particular job is crucial in the television, movie, and video game industries and is a key focus of the broadcast and electronic communications major at San Francisco State University. Using that job title in a dozen job-finding sites (e.g. Monster.com) returned more than 5,000 positions for electrical engineers and computer scientists, and not a single job for an actual audio designer.
To see why we received these results, we dug into the SOC and NAICS codes appended to the job offerings and found dozens of codes that had absolutely nothing to do with the listed jobs. The reason for these errors was simple: when a company is looking for employees, it inputs the codes related specifically to its industry or discipline. When a tech company like Google is looking for an engineer to design an audio codec, it appends the NAICS 541400 code for specialized audio systems design to the posting. The systems and HR professionals then do word searches for “audio” and “design” and push out the posting. So an audio design engineer who produces sound gets thousands of job listings for semiconductor designers with experience in audio codecs.
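The failure mode described above can be sketched in a few lines. This is a minimal illustration with hypothetical postings, not any site's actual matching code: both listings contain the words "audio" and "design", so a plain word search cannot tell a sound designer's job from a codec engineer's.

```python
# Two hypothetical postings: one real audio-design job, one
# engineering job that merely mentions audio and design.
POSTINGS = [
    {"title": "Audio Designer", "naics": "512250",
     "text": "design sound effects and audio for video games"},
    {"title": "Audio Codec Engineer", "naics": "541400",
     "text": "design audio codecs for embedded semiconductor systems"},
]

def naive_match(query_words, postings):
    """Return every posting whose text contains all query words,
    the way a simple keyword scan would."""
    return [p["title"] for p in postings
            if all(w in p["text"] for w in query_words)]

matches = naive_match(["audio", "design"], POSTINGS)
# Both postings match, even though only one is a real audio-design job.
```

Any ranking built on top of this kind of scan inherits the same false positives, which is consistent with what we saw across the sites we tested.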
Neither the NAICS nor the SOC systems are adequate sources of data for effective job placement efforts, yet these codes are foundational to all job postings.
The next problem is human fallibility.
In spite of the flood of automated job sites and technology, all of which we have found to be horribly flawed, the HR industry depends on humans who are deluged by unqualified applicants fed to them by that flawed technology. Those professionals are well versed in the legal requirements of their profession but lack basic tech understanding and underuse relevant technology.
We talked to 27 in-house and independent recruiters over the past year, some of them senior HR managers and vice presidents. None of them knew the capabilities of the search technology, much less the full capabilities of artificial intelligence. Many were flummoxed by simple spreadsheet tools. As a result, they do not use the basic automation tools available, even free ones, and are generally overwhelmed by the amount of communication they have to handle daily from employers and potential employees. One manager for a major marketing automation company did not use her own company’s technology for job openings. In another case, a recruiter for a social media company did not even have a profile on the company’s platform.
The combination of bad and misused data, and lack of basic tech understanding results in an ineffective mechanism for matching qualified people with jobs and, hence, the high number of qualified people leaving the job market altogether.
As frustrating as this may be for employers and job-seekers, it must be even more frustrating for the investment community. More than $4 billion has been invested in HR tech startups in the past two years, with no end in sight. Companies and seekers are paying these services millions of dollars in subscriptions with little positive result. When the current unemployment rate is combined with the number of people who have left the job market, the effective unemployment rate in the US is 40 percent, with no relief in sight.
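The "effective unemployment" figure above combines two populations. Here is a rough sketch of the mechanics; the input numbers are purely hypothetical, chosen only to show how such a rate is computed, and are not the actual US figures:

```python
def effective_unemployment(officially_unemployed, dropped_out, employed):
    """Share of would-be workers who are not working, counting both
    the officially unemployed and those who left the labor force."""
    would_be_workers = employed + officially_unemployed + dropped_out
    return (officially_unemployed + dropped_out) / would_be_workers

# Hypothetical figures (in millions), for illustration only:
rate = effective_unemployment(officially_unemployed=8,
                              dropped_out=52,
                              employed=90)
# rate = 60 / 150 = 0.40, i.e. 40 percent
```

The point of the sketch is that the headline unemployment rate (8/98 here, about 8 percent) understates the problem once labor-force dropouts are counted.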
What can be done? Start with the data.
The SOC and NAICS codes were not intended to help people find jobs. They were designed to classify and maintain records of employment for the purpose of population studies and taxation. Using them for job openings is a gross misuse of the data. The HR industry needs to employ experienced communicators and data scientists to develop and sort job opening data. The communicators can create accurate and realistic offerings, and the data scientists, using sentiment analysis, can identify and process the recruits more effectively.
Secondly, marketing automation and proper SEO can attract, sort and communicate effectively with applicants, making HR professionals more efficient and productive by eliminating frustration and wasted effort.
Finally, employers need to have an attitude adjustment regarding what they are looking for in potential employees. As the unemployment rate falls, it will be more difficult for overly selective companies to find productive employees. Instead of investing in tech based on bad data, invest in training of HR staff to do better work.
By Lou Covey, Editorial Director
This is part two of a two-part series. See part one here.
As the renewable energy industry rushes headlong into the distribution of expensive and toxic energy storage technology, there are signs that cheaper, cleaner, more efficient technologies will emerge soon, two of which were on display at Intersolar.
Flow batteries have been around for several decades, but the clamor for storage systems has brought them out of the labs and into commercial development of late. A flow battery is a combination of a fuel cell and a rechargeable battery. The difference between conventional batteries and flow batteries is that conventional batteries store energy in the electrode material, while flow batteries store it in the electrolyte. Rechargeability is provided by two chemical components dissolved in liquids contained within the system and separated by a membrane. One of the biggest advantages of flow batteries is that they can be almost instantly recharged by replacing the electrolyte liquid, while simultaneously recovering the spent material for re-energization.
Iron-flow technology is among the oldest versions of the technology, and there are several companies in this niche at varying stages of development; the best known is Electric Fuel Energy (EFE), an Israeli subsidiary of the US-based Arotech Corporation. However, while EFE has made several optimistic announcements about its development, it has not yet announced when the technology will be available even for a beta test.
Further down the road to reality is Energy Storage Systems, Inc. (ESS), who demonstrated their system at a private event during Intersolar. We interviewed Bill Sproull, ESS vice president of business development.
Video: ESS Iron-flow battery
Flow batteries are the cleanest and least expensive battery technology available, but they are not as well known as the dirtier and more expensive lithium-ion batteries. ESS hopes to change that with its iron-based system.
Offering a completely new direction, Aquion Energy has developed an aqueous hybrid ion battery based on readily available, renewable and non-toxic components, specifically salt water, stainless steel, carbon, manganese and cotton.
Like lithium-ion and lead-acid batteries, it will start to lose capacity after 10 years and will have to be replaced before the generation technology runs its course. However, at a cost of about $400 per kWh, it is a much cleaner and more affordable option than the pervasive technologies. Storage capacity and charging cycles are comparable to lithium-ion.
Aquion, however, is the only source of this new technology and has no plans to license it at present, which will limit adoption for the foreseeable future. It is also only in beta testing at a few sites around the world; it will be several years before it is widely available.
Aquion and ESS are closer to commercial distribution than other alternatives to lead-acid and lithium-ion, and both are targeting separate, though slightly overlapping, markets. ESS is looking only to provide systems for large-scale industrial and military applications, while Aquion is looking at smaller industrial/commercial and residential markets. Aquion, therefore, is targeting a market awash in lithium-ion options.
The cost difference is also significant. While the initial cost of both systems is about $400 per kWh, the Aquion system will lose its effectiveness before the energy production technology (wind or solar) reaches its end of life and will have to be entirely replaced. Replacement could add another 50 percent to its lifetime cost of $0.17 per watt. ESS, with a much longer lifespan, estimates its lifetime cost at $0.09 per watt with no replacement necessary.
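A back-of-the-envelope comparison makes the gap above concrete. The per-watt figures and the 50 percent replacement premium come from the paragraph above; everything else is simple arithmetic:

```python
# Lifetime cost per watt, from the figures in the text.
aquion_base = 0.17            # $/watt before any replacement
aquion_with_replacement = aquion_base * 1.5   # +50% if replaced once
ess_lifetime = 0.09           # $/watt, no replacement needed

# aquion_with_replacement works out to about $0.255/watt,
# versus $0.09/watt for ESS: roughly a 2.8x difference.
```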
Which brings us to the question: if all this is available now, why is it not gaining faster traction? We will deal with that question in our concluding analysis.
By Lou Covey, Editorial Director
Judging from the number of companies that exhibited at the Intersolar/EES joint conference in San Francisco, energy storage is the next big thing in green energy. It resolves the problem of intermittent over-and under-production that plagues sources like solar and wind. It comes, however, at a high cost to the pocketbook, the environment and personal safety. The good news is that there are alternatives to conventional storage technology. The bad news is those technologies are only now coming to market and are facing an uphill battle with the technology status quo.
First, let's look at the most popular technologies dealing with storage and why storage is even necessary in the first place.
Wind and solar are the most popular sources of green energy today, but they are also two of the most inconsistent, inefficient and costly sources of electricity. Wind only produces power when the wind is blowing, and only within a narrow range of wind speeds. If the wind blows too fast or too slow, your wind turbine becomes, essentially, useless. Solar produces power most effectively between 11 a.m. and 4 p.m., which happen to be off-peak hours when less electricity is needed. Production from solar panels disappears during peak demand times. Adding a storage technology allows systems to save up electricity during peak production times, providing "clean" power during peak demand. Because most systems are designed to store between 48 hours and a week of electricity without recharge, they also serve as a valuable source of energy when the wind doesn't blow or the sun doesn't shine.
The two most common forms of storage technology are lead-acid batteries, similar to but larger than the batteries in internal combustion cars, and lithium-ion (Li-ion) batteries, like those used in electric and hybrid cars and mobile devices. The average lifespan for these two popular technologies is about 10 years. That's where cost comes in as a significant factor.
Since windmills and solar panels have a stated maximum-generation lifespan of 20 years, the batteries must be replaced or enhanced at least once during that lifetime (just as in any other device), long before the generators die. Both technologies require significant and sophisticated hardware and software to charge, maintain and manage the flow of power efficiently and safely. Combine all of that and you get a cost of $1,000-$2,000 per kilowatt-hour. The average system needs to hold a minimum of two days' worth of power, so storage for a system drawing 5 kWh a day will cost between $10,000 and $20,000 for 10 kWh of capacity. That is added on top of the $30,000 the energy system costs, and, as stated, it will have to be completely replaced at least once during the life of the energy system. The minimal cost for energy storage, then, is more than half the cost of the energy system. Even with subsidies, that puts green energy out of the hands of many residential and commercial users. Storage systems have other significant costs that should not be overlooked.
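The arithmetic above is worth spelling out. This sketch uses the figures from the text, with the 10 kWh sizing following from two days of a 5 kWh daily draw:

```python
# Storage-cost arithmetic, figures from the text.
cost_per_kwh_low, cost_per_kwh_high = 1_000, 2_000  # $ per kWh installed
storage_kwh = 10            # two days of a 5 kWh/day household draw
system_cost = 30_000        # $ for the generation system itself

storage_low = storage_kwh * cost_per_kwh_low     # $10,000
storage_high = storage_kwh * cost_per_kwh_high   # $20,000
# Even the low end is a third of the system cost, and because the
# batteries must be bought at least twice over the system's 20-year
# life, the minimum lifetime storage outlay ($20,000) is two-thirds
# of the cost of the generation system itself.
```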
Now let's consider the environmental costs.
Stationary lead-acid battery systems (SLABs) carry a significant list of environmental compliance, enforcement, and liability concerns because the active ingredients are sulfuric acid and lead, two of the most toxic substances on earth. A spill may result from improper handling of hazardous material or from a slow, undetected corrosive breach in the battery housing, causing injurious, if not lethal, exposure to employees, workers, or tenants. Overcharging a battery, due to a failure in the software control system, can release hydrogen sulfide gas, a colorless, poisonous, flammable substance that smells like rotten eggs.
Lead is a highly toxic metal, and once the battery becomes inoperative it must be properly collected and recycled. A single lead-acid battery disposed of incorrectly into a solid waste collection system, and not removed before entering a resource recovery facility for mixed waste, could contaminate 25 tonnes of waste and prevent the recovery of the organic resources because of high lead levels.
Sixty-four percent of all the lead produced through mining goes into lead-acid batteries, and the harmful effects of improper recycling are giving pause (even in third-world countries where regulations are less stringent) to using them for energy storage. The government of India is mandating the elimination of the technology for anything but automotive use, because lead from storage batteries placed in unlined landfills is contaminating groundwater.
Li-ion batteries have advantages over lead-acid in that they are smaller and lighter, making them preferable for electric cars and mobile devices, but they are in their relative infancy in large-scale use.
“We are at the very beginning in energy storage in general,” says Phil Hermann, chief energy engineer at Panasonic Eco Solutions. “Most of the projects currently going on are either demo projects or learning experiences for the utilities. There is very little direct commercial stuff going on."
Moreover, Li-ion batteries are highly unstable, and without proper power management, usually via a software solution, they tend to burst into flame (e.g. the hoverboard). Every vendor we talked to at the EES conference dismissed safety concerns, essentially saying that the problem exists with other vendors, not them.
One company, a startup called ElectrIQ Power, provides a turnkey system in a box, including the batteries, inverter and control software, which it touts as its claim to safety superiority. Like all other Li-ion system suppliers, it warrants its systems for 10 years. At that point the battery capacity is 60 percent of what it was when new, requiring the purchase of a new system or an additional 5 kWh booster pack at the end of the 10-year period.
However, at present the company has no plans to help customers deal with the disposal of the batteries at the end of their useful life. No other Li-ion vendor could answer questions about eventual disposal either.
Video: ElectrIQ Power simplifies home energy storage
Here's an interview with the founder of ElectrIQ Power:
ElectrIQ Power is a software company integrating and managing multiple energy storage technologies into a single unit. The system is based on standard lithium-ion battery technology with hybrid inverters.
But beyond safety, the environmental issues facing the production of Li-ion are the most troubling. A 2013 EPA report concluded that batteries using lithium, nickel and cobalt have the “highest potential for environmental impacts,” citing negative consequences from mining, global warming, environmental pollution and human health impacts.
Take, for example, the Tesla factory near Reno, Nevada. As Nevada is the only source of lithium in the United States, locating the factory in that state was an obvious choice, and the Nevada government has always been open to toxic industries. Elemental lithium is flammable and very reactive. In nature, lithium occurs in compound forms such as lithium carbonate, which require chemical processing to be made usable. Typically found in salt flats in areas where water is scarce, lithium mining uses large amounts of water, and toxic chemicals are used for leaching, chemicals that themselves require waste treatment. There are widespread concerns about improper handling and spills, as in other mining operations around the world. Even in first-world countries, Li-ion battery recycling is in the single-digit percentage range; most batteries end up in landfill.
Finally, we have nickel-cadmium (NiCd) batteries, a very old technology that has been in commercial production since the 1910s. They are not as expensive as Li-ion and have more recharging cycles, but they are bulkier, have lower power densities, and must be completely discharged before recharging. They also survive longer than Li-ion and have not been known to explode when overcharged.
NiCd batteries have been used in early energy-storage applications. The Golden Valley Electric Association in Alaska, for example, operates a large NiCd system to stabilize its 27-megawatt wind farm, and a 3-megawatt NiCd system has stabilized the grid on the island of Bonaire since 2010. But their lack of density and the need to completely discharge before recharging have made them less valuable to the renewable energy industry. NiCd mining and production are just as toxic as Li-ion, and recycling is so toxic that the batteries have been banned in the European Union.
So, when it comes to adopting the most popular forms of energy storage in the world the question the market needs to answer is, "how much do we want to damage the environment for it?"
There are alternatives, though. We look at those next.
This is the next chapter of our series on energy production. We take a look at wind power: its history, application and challenges. The first use of wind power was in the sails of boats, and for more than two millennia wind-powered machines have been a cheap source of power for food production and moving water. Wind was widely available, was not confined to the banks of fast-flowing streams, and required no fuel. The Netherlands used wind-powered pumps to push back the sea, and wind pumps provided water for livestock and steam engines for over a century.
With the development of electric power, wind power found new applications in lighting buildings remote from centrally generated power, birthing the concept of distributed power systems. The 20th century saw wind plants for farms and residences, and larger utility-scale wind generators that could be connected to electricity grids.
By 2014, over 240,000 commercial-sized wind turbines were operating in the world, producing 4% of the world's electricity. Today we hear news about wind turbines delivering almost all the energy needs for countries like the Netherlands and Germany... for one or two days a year.
What they don’t report as often is the failure rate of those turbines and the loss of life associated with them.
Approximately 120 wind turbines catch fire every year in the UK alone, according to a joint 2014 engineering study by Imperial College London and the University of Edinburgh. Beyond fire, there are multiple accidents that don't result in system failure but do result in the deaths of engineers servicing the systems. In England, there were 163 wind turbine accidents that killed 14 people in 2011. Wind produced about 15 billion kWh that year, so, using a capacity factor of 25 percent, that translates to about 1,000 deaths per trillion kWh produced (the world produces 15 trillion kWh per year from all sources). Even using the worst-case scenarios from Chernobyl and Fukushima brings nuclear up to only 90 deaths per trillion kWh produced, still the lowest of any energy source.
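Setting the capacity-factor detail aside, the headline fatality rate above follows from simple division. This sketch reproduces it using the figures in the text:

```python
# Fatality-rate arithmetic, figures from the text.
deaths = 14              # UK wind-turbine deaths in 2011
uk_wind_kwh = 15e9       # ~15 billion kWh of UK wind output that year

deaths_per_trillion_kwh = deaths / (uk_wind_kwh / 1e12)
# = 14 / 0.015 ≈ 933, which rounds to the "about 1,000" cited above
```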
The United States appears to be the country that is most concerned with windgen safety, as it boasts the lowest number for deaths, injuries and catastrophic mechanical failures of wind turbines in the world. Even so, there are annual protests regarding the relative safety.
So why do countries continue to invest? Possibly for the relatively low cost. Each industrial turbine costs $3 million and can generate up to $500,000 in energy revenue a year, so they can pay for themselves in 6-10 years, and they generate power more consistently than solar. However, the effective lifespan of a turbine has been shown to be less than 15 years, which flies in the face of the conventional wisdom that they last 20 years. The annual maintenance cost for a modern turbine is 2 percent of the purchase price, or $30,000, and replacement parts can add as much as $500,000 over a 10-year period, so the total cost of a typical windmill over 15 years is about $4 million. That comes out to about $2.40 per watt per year for a typical onshore windmill if absolutely nothing goes wrong.
Wind power will continue to be a source of energy for years to come, but only as long as we are willing to pay the premium financially and in human life.
In our first two posts, we talked about how our reliance on turbine technology and carbon-fueled generation is not going away anytime soon, even though it is inefficient. Our next few posts will look at the problem of power distribution. The growth of alternative energy technology has fueled a movement toward distributed generation over grid distribution, which is what we more commonly employ. Grid power in the US is almost entirely alternating current (AC), the result of Nikola Tesla's AC winning the "war of the currents" over Thomas Edison's direct current (DC). We are not going to get into the benefits of one over the other here but will, instead, talk about what is: we have a massive investment across most of the US in that grid. The main exception is most of the state of Texas, which operates its own largely independent grid, along with a small portion of southern Alaska and some of the northeastern states.
All grid systems lose a certain amount of electricity in transmission, which is why power is stepped up to high voltages for long hauls. The problem with alternative power and most of the grid in the US is that alternative power cannot be sent to where it is needed when it is needed. It must be localized or "boosted" along the line. With large grid-scale facilities, like many of the photovoltaic facilities being built in the Southwestern deserts, that becomes a significant issue.
Distributed generation is a concept becoming more popular because it fits better into the uses of alternative power generation. Quite simply, it means you generate the power adjacent to where it is needed. If you put solar panels or wind generators on your property, you are a distributed-generation facility. That causes significant problems for the grid.
First, your solar panels produce DC power, similar to what your handheld devices use, but your home and the grid run on AC. You have to add an inverter to change the DC into AC so it can be used in your home and placed on the grid. Second, as we mentioned previously, the power you produce isn't necessarily produced when you or the grid actually needs it, so the utilities cannot rely on your power being available when it is needed. A more flexible grid would be able to better distribute your excess energy, as Texas's does, but most of us are not on such a grid and have to make do with what we have. We have invested far too much in the infrastructure to rip it out and install a new one.
That brings us to the owners and stewards of the grid: utility companies. Power generation has become a major headache for utilities, one that draws resources and money away from grid maintenance (in other words, the power lines and towers criss-crossing the country). There has been little investment in upgrading the grid because of it, but the grid remains a significant source of income for utilities, especially as the distributed power network grows. When it looked like they could make money off people who generated power with alternative energy, by buying it cheaply and selling it for a profit, they were more than willing to offer sweetheart deals to companies like SunPower. But as we have pointed out earlier, the profits have not been forthcoming and the deals, known as net metering, are going away.
At the same time, some utilities are looking at getting out of the generation game altogether. PG&E in Northern California has made no secret that it is not only not building any new generation facilities, it is selling off what it does have to independent companies, with an eye to upgrading distribution infrastructure. The utility actually buys much of the power it delivers to customers from out-of-state power plants. While this practice makes financial sense for the utilities as well as for the integrity of the grid, it means higher power bills for customers, including those who have already invested in alternative sources. The big losers in this paradigm shift will be the solar and wind companies that have relied on a steady stream of investment and revenue from the utilities.
By Lou Covey, Editorial Director
This is the second part of our series on the weaknesses of alternative power. In this installment, we look at the core of our generation technology: the turbine.
To evaluate turbine technology, we first need to understand how important it is to the production of electricity. What is going on in California is instructive.
Governor Jerry Brown recently signed a law requiring that 33 percent of the state's power production come from renewable sources, primarily solar. The state has been lauded for its progress, and by 2014 renewables accounted for 20 percent of total in-state production. However, that does not mean that 20 percent of the energy the state consumes comes from renewables. In reality, California produces only 67 percent of the energy it consumes, down from 90 percent in 1990.
California is increasingly dependent on power generation from other states, like Utah and Idaho, where the bulk of energy is produced by facilities that burn natural gas and coal to produce steam that drives their turbines. As a result, the power consumed in California is actually dirtier than it was 20 years ago. How this happens is an interesting shell game.
Let’s say you have a coal-fired generation plant producing 1 GW of power 24 hours a day. This power can be used anywhere because it is easily distributed through the entire network, but it is “dirty” power because the plant produces carbon dioxide. You want to clean up the environment, so you build a 500 MW solar farm to produce clean, renewable energy. But that farm produces power, at best, six hours a day, and that power can only be distributed in a very small area of the state. Then you shut down the coal-fired plant.
That leaves you with a net loss of 500 MW, so you contract with a Utah utility to buy its excess power to make up the difference. In fact, you have to buy more than that 500 MW, because you also have to provide power during peak usage, which begins after 4 p.m., when the solar panels go off line. The carbon dioxide produced by the Utah facility is equivalent to what was produced by your shuttered plant in California.
Here’s the good news, though: by shuttering the coal-fired plant you have shifted your generation mix by 1.5 GW toward renewables, even though you are producing 500 MW less. You now issue a press release saying you have dramatically increased the percentage of renewable power produced in the state… even though you have really done nothing for the environment. In fact, you may have made it worse.
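The arithmetic behind this shell game can be sketched in a few lines. The figures simply follow the scenario above: a 1 GW coal plant replaced by a 500 MW solar farm running the six hours a day the article assumes, with fossil-fuel imports covering the shortfall.

```python
HOURS = 24
coal_mw, solar_mw, solar_hours = 1000, 500, 6

# Daily energy before the switch: all in-state generation is coal
before_fossil_gwh = coal_mw * HOURS / 1000        # 24.0 GWh/day

# After: the coal plant closes, solar covers what it can,
# and out-of-state (fossil) imports make up the shortfall
renewable_gwh = solar_mw * solar_hours / 1000     # 3.0 GWh/day
import_gwh = before_fossil_gwh - renewable_gwh    # 21.0 GWh/day, still fossil

# In-state production is now "100 percent renewable" on paper...
in_state_share = renewable_gwh / renewable_gwh    # 1.0

# ...but the renewable share of what is actually consumed is small
consumed_share = renewable_gwh / (renewable_gwh + import_gwh)  # 0.125
```

The press release cites the first number; the atmosphere only sees the second.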
Steam turbines have been generating electric power since the early 1900s. In 1903, Commonwealth Edison opened Fisk Generating Station in Chicago, using 32 Babcock and Wilcox boilers driving several GE Curtis turbines at 5000 and 9000 kilowatts each, the largest turbine-generators in the world at that time. Almost all electric power generation, from the time of the Fisk Station to the present, has been based on steam-driven turbine-generators. The Fisk turbine was a single stage, with one set of blades, and could achieve a maximum theoretical efficiency of 33 percent but achieved much lower numbers in practice.
Efficiency is determined by the amount of power converted from the steam to usable electricity. Turbine efficiency depends on the number of blades, their design, the amount of turbulence behind each set of blades, friction and steam temperature. Over the years turbine efficiency has improved to as much as 40 percent as additional stages were added. Even today, though, there are limitations based on the quality of the water (seawater, alkaline water, etc.) and the quality of the blades.
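For readers who want a number to anchor these efficiency figures: the hard ceiling for any heat engine, steam turbines included, is the Carnot limit, set by the steam and cooling-water temperatures. A minimal sketch (the temperatures are illustrative, not from any particular plant):

```python
def carnot_limit(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible efficiency of any heat engine operating
    between a hot and a cold reservoir (temperatures in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# Steam at roughly 275 C rejecting heat to roughly 40 C cooling water
eta = carnot_limit(275 + 273.15, 40 + 273.15)   # about 0.43

# Real turbines, with blade losses, turbulence and friction,
# land well below this bound -- hence the 33-40 percent figures above
```

The point is not the exact numbers but the shape of the problem: incremental blade and stage improvements chip away at a gap that physics caps from above.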
However, turbines don’t produce electricity as soon as you flip a switch. The huge and expensive machines must be gradually spun up (using electrical or mechanical power) before steam can be gradually applied to heat and expand the blades to operational levels. This process can take several hours. Utilities have to predict demand half a day before spinning up the turbines so that, when the solar arrays go off line at 4-5 p.m., the turbines are already generating enough power to make up for the loss. Utilities may produce as much as 110 percent of maximum output from just after noon until demand drops after 8 p.m. That means a significant amount of the power produced during peak solar production is simply wasted.
Whether the turbines are driven by steam, hydro or gas, the blades need regular replacement and repair, depending on the quality of the working fluid or gas. Heat, contaminants and turbulence can weaken and warp the blades in a relatively short time, requiring that the power plant be taken off line in part or altogether. There are constant design advances to lessen downtime and increase output, but turbine technology remains inefficient and costly. As a result many utilities, like Pacific Gas and Electric, are divesting themselves of power generation to concentrate on distribution alone. As more third-party companies take over generation, our ability to maintain a steady flow of power is endangered. If one company fails economically, will we have the ability to make up for the loss of its production?
Alternative energy is a huge industry generating and spending money at breathtaking speed. Governments have invested trillions of dollars in building out the industry infrastructure and thousands of private citizens have invested billions in private applications. And yet two-thirds of our electrical energy is generated by fossil-fuel burning technology, just as it was 20 years ago. With all the infrastructure established in the past two decades, the world's demand for energy has outstripped our ability to meet it with alternative power. It's time we admitted that alternative power is not an alternative. It is only a supplement. Once we admit that fact, we may be able to get to work actually finding a real alternative.
This series had its genesis more than 40 years ago with my first foray into investigative journalism: a seven-part series on alternative energy as it was in the 1970s. The conclusion of that original work was that alternative energy lacked the ability to meet the world's needs, much less what it wanted. After 40 years of following the industry, I have found it is not much different now. The technology, while more efficient and cheaper, is still not sufficient to meet demand and probably never will be on our current course. Multiple reports predict that our energy usage will triple in the next 15 years. If “alternative” energy can’t keep up with the need today, how bad will things be in 15 years and beyond?
In this new series we will look at various sources of alternative power: the good, the bad and the future of each technology. It will conclude with a look at what might be possible today if only we looked outside the box we have created.
Today we set the stage for where we are.
There have been two contradictory articles in the Washington Post recently. The first stated that the costs of wind and solar have come down dramatically and are close to parity with coal and oil generation in cost per megawatt-hour. The article suggests that those dropping prices should make it easier to meet the demand for energy using alternative sources. What the article doesn't state is that the low cost is largely achieved through government subsidies, most of which are going away soon, not just in the US but everywhere in the world. Take away the subsidies and the price will skyrocket.
The second article, however, paints a very different picture. In 1990, two-thirds of all our power production came from coal, oil and natural gas generation plants. That was the beginning of the modern alternative energy industry as subsidies started growing. What also continued to grow was the world's demand for electricity, much of which is driven by the computing industry with always-on computers and data centers, the latter consuming 10 percent of all electricity generated. That demand has required additional generation from carbon-fuel technology to the point that after trillions of dollars in investment in alternative sources, coal, oil and gas still account for two thirds of all generation.
Much has been made of Europe's advances in alternative power. The Netherlands recently announced that its ocean-based wind farms delivered more than 100 percent of the country's power needs on one day this year, and some countries claim that 50 percent of their daily needs are often provided by alternative power. What is not discussed is how inconsistent those alternative sources are.
Wind produces power only when the wind blows within a specific, narrow range of speeds. If the wind speed is too low, the turbines don't turn. Too high, and the turbine has to be stopped to keep the blades from warping under the torque placed on them. Solar produces power for six hours a day at best, and mostly during the summer. That works out well for Spain, which gets a lot of sun in spring, summer and fall. It's not so great for Sweden, which gets virtually no sun for several months of the year. Bottom line: sun and wind are just not always available.
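That narrow operating window can be made concrete with an idealized turbine power curve: zero output below the cut-in speed or above the cut-out speed, roughly cubic growth in between, flat at rated output. Every parameter value here is illustrative, not the spec of any real turbine:

```python
def wind_power_kw(v: float, rated_kw: float = 2000.0,
                  cut_in: float = 3.5, rated_v: float = 13.0,
                  cut_out: float = 25.0) -> float:
    """Idealized wind turbine power curve. Speeds in m/s.
    Parameters are hypothetical, for illustration only."""
    if v < cut_in or v > cut_out:
        return 0.0            # too little wind, or shut down for safety
    if v >= rated_v:
        return rated_kw       # output is capped at the rated value
    # Power in the wind scales with the cube of wind speed
    return rated_kw * (v**3 - cut_in**3) / (rated_v**3 - cut_in**3)
```

A calm day (2 m/s) and a storm (30 m/s) both yield exactly zero, which is the grid operator's problem in a nutshell.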
As a result, Europe is quietly buying coal from the United States so it can gear up its older power plants to provide electricity on a consistent basis. This is good for the US, whose coal reserves rival Saudi Arabia's oil reserves. The US coal industry has seen domestic demand drop as harsher regulations keep electricity production from coal severely limited; at the same time, natural gas is enjoying a rapid increase in demand.
California has been crowing about the rapid increase in its alternative energy production and is predicting that 50 percent of all power produced in the state will come from alternative sources by 2030. That is entirely plausible as the state closes nuclear and carbon-fuel plants, but California currently imports 30 percent of its power from states producing energy surpluses with coal-burning plants. Part of the problem is that even in perfect conditions, some of the most touted technologies are not producing as expected.
For example, a massive facility in Ivanpah, California uses acres of reflective panels to focus solar radiation on a single column at the center of the facility. The heat turns water to steam, which in turn drives traditional turbines to produce electricity. The problem is that the facility is not producing power on solar alone: natural gas has had to be brought in to supplement the heat source and bring the facility up to its promised capacity. The root of the problem lies in the turbines, which are rated at 33.3 percent theoretical efficiency but in reality operate at 25 percent.
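The gap between rated and actual efficiency translates directly into extra fuel. A back-of-the-envelope calculation, using the two efficiency figures above and an arbitrary 100 MW output target (the target is illustrative, not Ivanpah's actual rating), shows why supplemental gas is needed:

```python
target_mw = 100.0                  # illustrative electrical output target

heat_rated = target_mw / 0.333     # thermal MW needed at rated efficiency
heat_actual = target_mw / 0.25     # thermal MW needed at actual efficiency

# Fractional extra heat required because the turbines underperform
extra_heat = heat_actual / heat_rated - 1   # about 0.33
```

Every megawatt of electricity needs roughly a third more heat than the turbines were rated for, and that extra heat has to come from somewhere; at Ivanpah, it comes from natural gas.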
That brings up the issue of utilities, which have the responsibility of meeting the population's power demands. The alternative energy industry promised utilities a source of home-produced electricity by using customers' roofs for solar and wind power. Over the past 10 years that source has proved wildly unpredictable, requiring utilities to keep existing carbon-fuel plants spun up to 110 percent just in case solar and wind production drops off, which happens more often than not. Lawsuits have gone back and forth across the country as utilities and homeowners discover that the promises of the alternative energy companies cannot be met with current technologies. The US Department of the Treasury is also investigating several large alternative energy companies for over-valuing technology for tax purposes.
When all the facts are in view, the alternative energy industry, indeed the entire energy sector, is in serious disarray. There is hope, but only if we take a realistic view of what is actually happening.
At the core of the difficulty will be the centerpiece of all energy conversion: the turbine. Turbines are used in traditional energy plants run on coal, oil and natural gas, but they are also used in hydroelectric, geothermal, solar concentration (like Ivanpah), tidal, wind and waste-heat conversion. Without a thorough rethinking of turbine design, we will be hard pressed to find a true alternative.
This series will look at all forms of energy production, from fossil-fuel to experimental concepts and everything in between. We will begin, next, with a look at the problem of turbines.
Sponsored by 3DP-international
We interviewed Jack Wolosewicz, CTO of Eurocal Group (a New Tech Press sponsor) recently about the recurring stories regarding Android security.
A recent story from Reuters kicked up a bit of a stir by claiming that hackers could use sound to steal data from a computer or network. While the story was true, it turns out it isn't the whole story, and the threat is less serious than the report implied, according to cybersecurity expert Jack Wolosewicz. Here's the interview.
The problem excessive heat causes solar panels is significant. On a hot day, a panel can lose 10-25 percent of its rated output, and over time, consistent heat above 90 degrees Fahrenheit can cause permanent degradation of up to 30 percent within a couple of years. That information isn't widely disseminated, and few people, even those who sell the technology, know the problem exists. That's why, when we were talking to PV vendors at the Intersolar Conference in San Francisco, the only people who would acknowledge the problem were at the FAFCO/SunWorld booth.
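Those losses follow from a spec most panel datasheets carry: crystalline-silicon panels typically lose on the order of 0.4 percent of output per degree Celsius of cell temperature above the 25 C standard test condition. The exact coefficient varies by panel, so the sketch below is an approximation, not a datasheet calculation:

```python
def derated_output_w(rated_w: float, cell_temp_c: float,
                     temp_coeff: float = -0.004) -> float:
    """Approximate PV output at a given cell temperature, relative to
    the 25 C standard test condition. temp_coeff is fractional loss
    per degree C; -0.004 is a typical crystalline-silicon value."""
    return rated_w * (1 + temp_coeff * (cell_temp_c - 25))

# A 300 W panel whose cells reach 65 C on a hot roof (cell temperature
# runs well above air temperature) drops to roughly 252 W,
# a 16 percent loss -- squarely in the range the article cites
hot_output = derated_output_w(300, 65)
```

Pulling that heat off with circulating water, as FAFCO does below, attacks the loss at its source.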
FAFCO has been providing passive solar water heating products and SunWorld PV panels since the 1970s. They have created several cogeneration systems over the years, but this is the first that marries the two technologies.
FAFCO has invented a heat-exchange system known as CoolPV™ that attaches to the back of a typical 3x5 panel. Acting and looking very much like a car radiator, the system runs cold water through channels that draw heat off the panel. Currently, the most common use for the heated water is warming spas and swimming pools, according to FAFCO president Bob Leckinger.
Leckinger preferred not to give the cost of the system, but claimed it could be recouped within three years. How you might buy the technology is another question.
Leckinger said they are selling product now, but a search of the SunWorld and FAFCO websites turns up no information on the new systems. And since the companies are focused on recreational uses, it's not likely the technology will be available to the general public any time soon. That is unfortunate, because there are a lot of solar panel farms literally baking in the American Southwest.
See full interview here:
By Lou Covey, editorial director
The electrical energy storage industry continued to grow in credibility this week at the Intersolar 2015 conference with a co-located show in Moscone West. However, as a possible indicator that it is still a very small market, the Intersolar folks put the show name all in lowercase (ees).
The sector's installed base is set to grow 250 percent year over year by the end of 2015, according to GTM Research, but other reports note that this represents a total investment of just $2.6 billion worldwide. By comparison, solar energy installations represented an investment of $172 billion as of the end of last year. The industry has nowhere to go but up.
There is no obvious leader rising in the ranks, except by general impression. The industry had done very little to distinguish itself until Elon Musk announced in May that Tesla will be offering home and industrial storage products “real soon.” That was enough for plenty of wealthy people with electric cars and solar panels to put down a big chunk of cash to get their systems… sometime next year (a fool and his money…).
The reality is that the industry has been around for some time, selling products around the world relatively profitably without a clear market leader. One might think the attention paid to the Tesla announcement would cause some jealousy, but that was not the case at ees. Every company offering a storage system (and there were many) was practically salivating over its prospects.
“We are selling proven products with higher capacities and lower cost now than what Tesla says they are going to sell,” said Stefanie Kohl, marketing director of Sonnen-Batterie. “We made a decision to enter the US market early last year, and when Tesla made their announcement it was a nice gift to our marketing budget. Now everyone knows what it is and we can provide a better product for a better price." Being first to market is not always best.
Most companies offering storage products at ees called themselves a “market leader” for one reason or another; Sonnen-Batterie calls itself “the German market leader.” It has sold close to 4,000 units of its intelligent energy storage system to home owners, farmers and businesses since entering the German market in 2011. Germany has approximately 1.5 million solar installations, with more coming every day, so Sonnen-Batterie has a way to go before it reaches market saturation, but it seems a good start.
The investment community thinks so, too. Last December, Dutch and German investors put up $10 million to fund the company's expansion.
The issue still to be resolved is cost per watt. Storage systems make sense for commercial and residential applications when there is money to spend. With solar installations producing power at $0.33 per watt, they are a pretty good deal compared with peak power from utilities, which runs around $0.85 per watt between noon and 6 p.m. But adding a storage system can make it a wash, or even end up costing more.
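The wash is easy to see with the article's own per-watt figures and a hypothetical storage adder. The $0.50/W storage figure below is an assumption chosen for illustration, not a quoted price:

```python
solar_cost = 0.33     # article's figure for solar production, $/W
peak_rate = 0.85      # article's figure for utility peak power, $/W
storage_cost = 0.50   # HYPOTHETICAL storage adder, $/W, for illustration

# Advantage of solar over buying peak power from the utility
margin_without_storage = peak_rate - solar_cost                 # 0.52

# Add storage and the advantage nearly vanishes -- "a wash"
margin_with_storage = peak_rate - (solar_cost + storage_cost)   # about 0.02
```

Under these assumptions, storage eats almost the entire cost advantage, which is why the technology remains, for now, the realm of the wealthy.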
So like all alternative energy technology, storage technology is still the realm of the wealthy. But it is a good start in the right direction.
At the 52nd Design Automation Conference in San Francisco, we talked to Rod Simon of OpenText about their collaboration platform, Exceed VA TurboX, which was introduced to the EDA industry at the conference. With a web-based interface, Exceed VA TurboX is a hybrid solution intended to improve users' productivity by enhancing collaboration from any location, securely. It is designed for the enterprise data center, so administrators can easily manage and monitor access to sensitive applications and data. Here's the interview: