TelecomTV

TelecomTV TRACKER

Sourced by TelecomTV's TRACKER platform from Hitachi Newsroom

Hitachi develops open source software based big data analytics technology to increase speed by up to 100 times

Via Hitachi Newsroom

Nov 14, 2017

For a high-speed analytics system with lower IT investment

[Image: Overview of the technology developed]

Tokyo, November 14, 2017 --- Hitachi, Ltd. (TSE: 6501, "Hitachi") today announced the development of a technology that increases the speed of big data analytics on an open-source, Hadoop-based distributed data processing platform*1 ("Hadoop platform") by up to 100 times compared with a conventional system. The technology converts the data processing procedure that conventional Hadoop generates for software execution into one optimized for parallel processing on hardware, enabling an FPGA*2 to process various types of data at high speed. As a result, fewer servers will be needed for high-speed big data analytics, minimizing IT investment while enabling interactive analytics by data scientists, quick on-site business decision making, and other timely information services. Hitachi will apply the technology to areas such as finance and communications and, through verification tests, will use it to support a platform for data analytics services.

In recent years, big data analytics, in which large volumes of varied data such as IoT sensor information, financial account transaction records and social media are analyzed interactively under various conditions and from various perspectives, has become increasingly important to businesses and services. The open-source Hadoop platform is widely used for such analytics; however, because many servers are required to raise processing speed, equipment and management costs have been an issue.

In 2016, Hitachi developed a high-performance data processing technology using FPGA*3. Because that technology was built for Hitachi's proprietary database, which employs a different data management method and customized database management software, it could not easily be applied to the Hadoop platform.

To address this, Hitachi has now developed technology that realizes high-speed data processing on the Hadoop platform utilizing FPGA*4. The features of the technology are outlined below.

(1) Data processing procedure conversion technology to optimize FPGA processing efficiency

The Hadoop platform's data processing engine executes retrieval, filtering and computation serially in software on the CPU. Simply offloading this serial procedure to hardware, however, does not fully exploit the hardware's potential for high-speed parallel processing. Hitachi therefore analyzed the Hadoop processing procedures and, taking distributed processing efficiency into consideration, developed technology that reorders the processing commands into a sequence optimized for parallel processing on the FPGA. This allows the FPGA circuitry to be used efficiently, without idle logic.
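The press release does not publish Hitachi's conversion algorithm, but the idea of restructuring a serial retrieve-filter-compute procedure into independent element-wise steps can be illustrated with a minimal sketch. All names and the plan structure below are assumptions for illustration only:

```python
# Hypothetical sketch: a serial, row-at-a-time plan vs. the same query
# restructured so that filter and compute are independent element-wise
# passes over the batch -- the shape that maps onto parallel hardware
# lanes. This is an analogy, not Hitachi's actual implementation.

rows = [
    {"amount": 120, "region": "APAC"},
    {"amount": 80,  "region": "EMEA"},
    {"amount": 250, "region": "APAC"},
]

def serial_plan(rows):
    """Row at a time, each step short-circuiting into the next."""
    out = []
    for r in rows:                       # retrieve
        if r["region"] == "APAC":        # filter
            out.append(r["amount"] * 2)  # compute
    return out

def parallel_friendly_plan(rows):
    """Each step is a whole-batch, element-wise pass (one 'lane' each)."""
    mask = [r["region"] == "APAC" for r in rows]          # filter lane
    doubled = [r["amount"] * 2 for r in rows]             # compute lane
    return [v for v, keep in zip(doubled, mask) if keep]  # merge lanes

assert serial_plan(rows) == parallel_friendly_plan(rows) == [240, 500]
```

The reordered form does slightly more total work in software, but because each pass is independent per element, a hardware pipeline can execute all lanes concurrently, which is the trade-off the conversion exploits.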

(2) Logic circuit design to analyze various data formats and enable high-speed processing in FPGA

Conventionally, to facilitate processing in hardware, FPGA processing restricted the formats of data types such as dates, numerical values and character strings, and required a dedicated processing circuit for each type of data. The Hadoop platform, however, must handle multiple formats even for the same item; a date, for example, may be expressed as a UNIX epoch day or as a Julian day, among others. Supporting each format with a dedicated circuit would exhaust the limited FPGA circuitry, so conventional FPGA processing could not be used effectively. To resolve this, Hitachi designed a logic circuit that optimizes parallel processing in the FPGA: parser circuits identify the type and size of each value*5 and, depending on that type and size, pack multiple values into a single processing circuit. As a result, the design not only handles various data formats but also realizes parallel processing that fully utilizes the filtering and aggregation circuits for efficient high-speed data processing.
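The date-format problem mentioned above can be made concrete with a small normalization sketch. The format tags and helper name are assumptions for illustration; only the UNIX-epoch/Julian-day relationship (1970-01-01 has Julian Day Number 2440588) is a known fact:

```python
# Hypothetical sketch of the normalization step a parser circuit performs:
# the same logical "date" column may arrive as a UNIX epoch day or as a
# Julian Day Number, and both must be mapped to one internal form before
# values can be packed into a shared processing circuit.

UNIX_EPOCH_JDN = 2440588  # Julian Day Number of 1970-01-01

def to_epoch_day(value, fmt):
    """Normalize a date to UNIX epoch days (days since 1970-01-01)."""
    if fmt == "epoch_day":
        return value
    if fmt == "julian_day":
        return value - UNIX_EPOCH_JDN
    raise ValueError(f"unknown date format: {fmt}")

# Mixed-format input for one logical column:
raw = [(0, "epoch_day"), (2440588, "julian_day"), (18262, "epoch_day")]
normalized = [to_epoch_day(v, f) for v, f in raw]
assert normalized == [0, 0, 18262]
```

Once every value shares one representation and a known size, a single filtering or aggregation circuit can process many packed values in parallel instead of dispatching to per-format circuits.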

The technology was applied to the Hadoop platform, and when analytics was performed on sample data, data processing performance improved by up to 100 times. The results suggest that the cost of Hadoop-based big data analytics can be reduced, since the number of servers required for high-speed processing falls significantly. Hitachi will now conduct verification tests together with customers as it works towards commercialization of the technology.

The technology developed will be on exhibit at SC17 - The International Conference for High Performance Computing, Networking, Storage and Analysis, to be held from 13th to 16th November 2017 in Denver, Colorado, USA.


*1 Hadoop-based distributed data processing platform: a computation platform for storing and analyzing large amounts of data on distributed servers using the open source software "Hadoop"

*2 FPGA (Field Programmable Gate Array): an integrated circuit manufactured to be programmable by the purchaser. In general, FPGAs are inexpensive compared with application-specific circuits.

*3 3rd August 2016 News Release: "Hitachi develops high performance data processing technology increasing data analytics speed by up to 100 times"

*4 10 related international patents pending

*5 Supports the standard format "Parquet," generally used in open source data processing platforms such as Hadoop

Related Topics
  • Announcement
  • Asia-Pacific
  • Data & Analytics
  • News
  • Open Source
  • Tracker


