Big Data

Secure collaboration is a quiet trend at #52DAC

By Lou Covey, Editorial Director

While outsourcing software and design development is a common practice, the idea of putting your company’s crown jewels into the cloud for a freelancer to monkey with tends to drive sales of anti-emetics. Can you safely allow virtual strangers to access your server, or should you just suck it up and overwork your employees?

That has been a continuing conundrum for the Electronic Design Automation (EDA) industry and its customers in the embedded software and semiconductor industries. Larger companies, like Synopsys and Intel, either build internal security paradigms into their collaborative tools or work with some of the big players, like IBM and OpenText. The costs of those tools, however, don't always fit the budgets of smaller companies and can be a hindrance to outsourcing firms.

What makes the whole issue more difficult is that while companies readily admit it is an important issue, not many are actually willing to talk about what they are doing about it.

At the Design Automation Conference in San Francisco this week, there was a noticeable presence of companies stating they actually do provide for secure collaboration and were more than willing to tell you who they provide it for. One of the main players, OpenText, proudly proclaims its list of customers, including, in the electronics world, Alcatel-Lucent, Cirrus Logic and Renesas (see interview here).

Other players, like the recently funded Zentera, not so much. We visited Zentera's booth at the conference and they were quite adamant about not saying anything substantial on the record, even though their website touts a long list of partners, including Microsoft and Qualcomm.

Then you get into the realm of the EDA tool providers, and the walls go up quickly. Mentor Graphics expressed surprise that one of its major customers, Qualcomm, was working with Zentera to provide secure collaboration. Synopsys and Cadence each claim their own “cloud” solutions, consisting of private servers stuffed into their headquarters buildings.

Dassault Systèmes, on the other hand, was quite effusive about its Enovia collaborative platform, which enforces security according to roles, geography and hierarchy. Dassault is relatively new to the world of semiconductor design and is making a strong effort to differentiate itself from the “holy trinity” of Synopsys, Mentor Graphics and Cadence. It is miles ahead of the EDA industry on collaboration and security, simply because its much broader customer base includes mil-aerospace niches that require a standardized approach.
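To make the role/geography/hierarchy idea concrete, here is a minimal sketch of what an access decision keyed on those three dimensions might look like. This is a generic, hypothetical illustration (all names and fields are invented), not Dassault's actual Enovia implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class User:
    name: str
    role: str        # e.g. "designer", "contractor"
    region: str      # e.g. "US", "EU"
    org_path: str    # position in the company hierarchy, e.g. "acme/soc/frontend"

@dataclass(frozen=True)
class Asset:
    name: str
    allowed_roles: frozenset
    allowed_regions: frozenset
    org_scope: str   # subtree of the hierarchy allowed to see this asset

def can_access(user: User, asset: Asset) -> bool:
    """Grant access only if role, geography, AND hierarchy all permit it.

    The hierarchy test is a simplified string-prefix check; a real system
    would compare path components to avoid prefix collisions.
    """
    return (user.role in asset.allowed_roles
            and user.region in asset.allowed_regions
            and user.org_path.startswith(asset.org_scope))

rtl = Asset("cpu_rtl",
            allowed_roles=frozenset({"designer"}),
            allowed_regions=frozenset({"US"}),
            org_scope="acme/soc")

print(can_access(User("alice", "designer", "US", "acme/soc/frontend"), rtl))
print(can_access(User("bob", "contractor", "US", "acme/soc/frontend"), rtl))
```

The point of gating on all three dimensions at once is that an external contractor can be given a role inside the environment without inheriting visibility into the whole design tree.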

For third-party providers of design services, these secure collaboration platforms can open doors to working with the most cutting-edge technology companies, which are often strapped for resources. Customers that want to integrate design environments from multiple sources can use them to fold external design teams into an all-encompassing environment without giving up those aforementioned crown jewels. If the customer doesn't want the additional expense, it might be worth the investment for outsourcers to adopt the collaboration platforms themselves and work the cost into their services overall.

ARM, x86, GPUs not filling the bill for HPC

By Lou Covey, Editorial Director

The data industry has been limping along with processing systems designed for personal computing and jury-rigged for big data and high-performance computing. That has worked for light-touch applications like Facebook, but for compute-intensive applications like the Oculus Rift virtual environment and Oracle Database In-Memory, the hardware infrastructure is woefully insufficient (even Oracle's own hardware).

The Rube Goldbergesque contrivances that have brought us this far have made it easy to ignore investment in new approaches, but they will fail us in the foreseeable future and retard growth in data-intensive enterprises. And the design automation industry's efforts to fix the insufficiency with additional kluges are just as unprofitable.

New Tech Press met with the renowned computer scientist John Gustafson, who confirmed that the infrastructure cornerstones (Intel's x86 platform and ARM cores) will not bring us into the new age of high-performance computing, even with help from graphics processors from NVIDIA and AMD. Gustafson posits that we need something new, and that investment in that yet-to-be-determined platform needs to happen now.

Big Data: Privacy vs. Benefits

By Joe Basques, Managing Editor

A recent report by IBM said 2.5 exabytes of data were created every day through 2012. This is almost nothing compared to what we will collect in 3 to 5 years as the Internet of Things moves into reality and everything from milk jugs to the clothes we wear will contain sensors actively collecting data.

Two of the biggest challenges businesses face today are where to begin when developing a strategic big data plan, and how to lessen the “creepy factor” so customers willingly consent to contributing their data. Gartner predicts that one-third of Fortune 100 companies will experience an information management crisis by 2017, largely because many U.S. companies don't have a clear data strategy.

As part of our ongoing series looking at the latest in Big Data, we sat down with Ann Buff, Business Solutions Manager and Thought Leader for SAS, at Enterprise Data World in Austin, Texas, to discuss big data strategy and how companies overcome the “creepy factor” to provide a high-value proposition to their customers.