
BlueData, a pioneer in Big Data private clouds, announced a technology preview of the Tachyon in-memory distributed storage system as a new option for the BlueData EPIC platform. Together with the company’s existing integration with Apache Spark, BlueData supports the next generation of Big Data analytics with real-time capabilities at scale, allowing organizations to realize value from their Big Data that wasn’t possible before. In addition, this new integration enables Hadoop, HBase virtual clusters, and other applications provisioned in the BlueData platform to take advantage of Tachyon’s high-performance in-memory data processing.

Enterprises need to be able to run a wide variety of Big Data jobs, such as trading, fraud detection, cybersecurity and system monitoring. These high-performance applications must run in real time and at scale in order to provide true value to the business. Existing Big Data approaches built on Hadoop are relatively inflexible and do not fully meet the business need for high-speed stream processing. Newer technologies overcome these limitations: Spark offers up to 100X faster data processing, and Tachyon up to 300X higher throughput.

“Big Data is about the combination of speed and scale for analytics. With the advent of the Internet of Things and streaming data, Big Data is helping enterprises make more decisions in real time. Spark and Tachyon will be the next generation of building blocks for interactive and instantaneous processing and analytics, much like Hadoop MapReduce and disk-based HDFS were for batch processing,” said Nik Rouda, senior analyst at Enterprise Strategy Group. “By incorporating a shared in-memory distributed storage system in a common platform that runs multiple clusters, BlueData streamlines the development of real-time analytics applications and services.”

However, incorporating these technologies into existing Big Data platforms like Hadoop requires point integrations on a cluster-by-cluster basis, which makes the process manual and slow. With this preview, BlueData streamlines that infrastructure work by creating a unified platform that incorporates Tachyon, allowing users to focus on building real-time processing applications rather than manually cobbling together infrastructure components.
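To make the integration concrete, here is a minimal PySpark sketch of the kind of workflow this enables: one Spark job persists its output to Tachyon, and a second job, potentially running in a different virtual cluster, reads it back at memory speed. The hostnames, paths and the Spark 1.x-era external block store setting are illustrative assumptions, not BlueData’s actual configuration.

```python
# Minimal sketch: sharing data between Spark jobs via Tachyon.
# Assumes a Spark 1.x-era deployment with Tachyon client support and a
# Tachyon master at tachyon-master:19998 -- all names are illustrative.
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("tachyon-sharing-demo")
        # Point Spark's off-heap/external block store at the Tachyon cluster.
        .set("spark.externalBlockStore.url", "tachyon://tachyon-master:19998"))
sc = SparkContext(conf=conf)

# Job 1: aggregate events and write the result to Tachyon, so other
# clusters (Hadoop or HBase jobs, for example) can read it at memory
# speed instead of from disk-based HDFS.
events = sc.textFile("hdfs:///logs/events")
counts = (events.map(lambda line: (line.split(",")[0], 1))
                .reduceByKey(lambda a, b: a + b))
counts.saveAsTextFile("tachyon://tachyon-master:19998/shared/event-counts")

# Job 2: a different application reads the same in-memory data set back
# without re-running the aggregation or touching HDFS.
shared = sc.textFile("tachyon://tachyon-master:19998/shared/event-counts")
print(shared.take(5))
```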

“We are thrilled to welcome BlueData into the Tachyon community, and we look forward to working with BlueData to refine features for Big Data applications,” said Haoyuan Li, co-creator and lead of Tachyon.

The BlueData platform also includes high availability, auto-tuning of configurations based on cluster size and virtual resources, and compatibility with each of the leading Hadoop distributions. Customers who deploy BlueData can now take advantage of these enterprise-grade benefits along with the memory-speed advantages of Spark and Tachyon for any Big Data application, on any server, with any storage.

“First generation enterprise data lakes and data hubs showed us the possibilities with batch processing and analytics. With the advent of Spark, the momentum has clearly shifted to in-memory and streaming, with emerging use cases around IoT, real-time analytics and high-speed machine learning. Tachyon’s appealing architecture has the potential to be a key foundational building block for the next generation logical data lake and key to the adoption and success of in-memory computing,” said Kumar Sreekanti, CEO and co-founder of BlueData. “BlueData is proud to deliver the industry’s first Big Data private cloud with a shared, distributed in-memory Tachyon file system. We look forward to continuing our partnership with Tachyon to deliver on our mission of democratizing Big Data private clouds.”

Source: Inside-BigData.com

If you wish to process huge piles of data very, very quickly, you’re in luck.

From the comfort of your own data center, you can now use Google’s recently announced Dataflow programming model for processing data in batches or as it comes in, on top of the fast Spark open-source engine.

Cloudera, which sells a distribution of the open-source Hadoop software for storing and analyzing large quantities of many kinds of data, has been working with Google to make that possible, and the two companies announced today that the results of their efforts are now available for free under an open-source license.

The technology could benefit the burgeoning Spark ecosystem, as well as Google, which wants programmers to adopt its Dataflow model. If that happens, developers might well feel more comfortable storing and crunching data on Google’s cloud.

Google last year sent shockwaves through the big data world it helped create when Urs Hölzle, Google’s senior vice president of technical infrastructure, announced that Googlers “don’t really use MapReduce anymore.” In lieu of MapReduce, which Google first developed more than 10 years ago and which still lies at the heart of Hadoop, Google has largely switched to a new programming model for processing data in streaming or batch format.

Google has brought out a commercial service for running Dataflow on the Google public cloud. And late last year it went further and issued a Java software-development kit for Dataflow.

All the while, outside of Google, engineers have been making progress. Spark in recent years has emerged as a potential MapReduce successor.

Now there’s a solid way to use the latest system from Google on top of Spark. And that could be great news from a technical standpoint.

“[Dataflow’s] streaming execution engine has strong consistency guarantees and provides a windowing model that is even more advanced than the one in Spark Streaming, but there is still a distinct batch execution engine that is capable of performing additional optimizations to pipelines that do not process streaming data,” Josh Wills, Cloudera’s senior director of data science, wrote in a blog post on the news.
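Google’s Dataflow model was later open-sourced as Apache Beam, where the Spark runner work continued. As a hedged illustration of the unified batch/streaming model Wills describes, here is a minimal pipeline in Beam’s present-day Python SDK (the SDK Google shipped at the time was Java; the paths and window size are illustrative):

```python
# Sketch of the Dataflow programming model (now Apache Beam): the same
# pipeline shape handles bounded (batch) or unbounded (streaming) input,
# with windowing applied explicitly as a transform.
import apache_beam as beam
from apache_beam.transforms.window import FixedWindows

with beam.Pipeline() as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/access-logs/*")  # hypothetical path
     | "ExtractUser" >> beam.Map(lambda line: line.split()[0])
     # One-minute fixed windows; on a streaming source the identical
     # transform applies to data as it arrives.
     | "Window" >> beam.WindowInto(FixedWindows(60))
     | "CountPerUser" >> beam.combiners.Count.PerElement()
     | "Format" >> beam.MapTuple(lambda user, n: "%s,%d" % (user, n))
     | "Write" >> beam.io.WriteToText("gs://my-bucket/user-counts"))
```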

Source: VentureBeat.com


IBM’s Watson Analytics service is now in open beta. The natural-language-based system, born of the same programme that developed the company’s Jeopardy-playing supercomputer, offers predictive and visual analytics tools for businesses.


Early this summer, IBM announced it is investing more than $1 billion into commercializing Watson. Watson Analytics is part of that effort. The company promises that it can automate tasks such as data preparation, predictive analysis and visual storytelling.

IBM will offer Watson Analytics as a cloud-based freemium service, accessible via the Web and mobile devices. Since it announced the programme in the summer, 22,000 people have registered for the beta.

The launch of Watson Analytics follows the announcement two months ago that IBM has teamed up with Twitter to apply the Watson technology to analysing data from the social network.

Source: TheNextWeb.com

 


A pivotal 2012 Gartner report predicted that data would grow 800 percent over the following five years and that 80 percent of it would be unstructured. As a result, savvy retailers are harnessing the power of big data by combining data from a number of sources, such as web browsing patterns, social media and industry forecasts. They are using big data to predict trends, prepare for demand, optimize pricing, and ultimately to improve each customer’s experience to increase acquisition and retention rates. For example, supermarkets are combining their loyalty card data with social media information to detect and leverage changing buying patterns, while customers get timely offers and deals that match their needs.

Our friend Chris Landry over at Colourfast Printing produced the compelling infographic below that summarizes the retail industry’s interest in big data technology. It looks like 2015 will be a banner year for big data and retail!

Source: Inside-BigData.com

 


Sentient Technologies, a company seeking to solve the world’s most complex problems through massively scaled artificial intelligence, today announced it has raised $103.5 million in Series C funding, bringing the total investment in Sentient to more than $143 million. Access Industries led the round, with Tata Communications (Hong Kong) Limited (a wholly owned indirect subsidiary of Tata Communications Limited), existing investor Horizons Ventures, and a group of private investors in the fields of finance, consumer, food and beverage, and real estate all participating. Sentient will use the funds to further expand its distributed artificial intelligence products and services.

“As an investor, we share a common vision of the transformative role that massively distributed computing and artificial intelligence can play in helping businesses get insights and solve their most complex big data problems,” said Vinod Kumar, MD and CEO, Tata Communications. “We see Sentient at the forefront of these technologies, bringing a disruptive approach to cloud-based computing services. Furthermore, the scale of our leading global network infrastructure and data center footprint also complements Sentient’s growth plans and will enable its global deployment.”

BUILDING THE MOST POWERFUL INTELLIGENT SYSTEM IN THE WORLD

Sentient operates distributed artificial intelligence on an unprecedented scale, routinely running multiple distributed AI jobs on millions of AI processing nodes and producing actionable results validated on large and complex data sets. Utilizing evolutionary computation and deep learning – designed to continuously evolve and improve – Sentient aims to create the world’s most powerful intelligent system. This system enables researchers, innovators and companies to solve mission-critical, high-value problems.

Sentient’s distributed artificial intelligence has unique, patented and powerful capabilities that address the distributed, varied, asynchronous nature of data, and its continuous influx and growth, in order to understand it and make accurate, actionable decisions.

“As a director of my Chairman’s charitable foundation, The Li Ka Shing Foundation, I have been involved with the team at Sentient for several years,” said Frank Sixt, Executive Director and Group Finance Director of Hutchison Whampoa Limited. “I believe their unique resources and approach in focusing AI capabilities to achieve step changes in existing human capabilities in specifically targeted functional domains hold great potential for many businesses, including Hutchison’s own energy, retail, ports and telecommunications verticals.”

The Sentient team has been quietly demonstrating the capabilities of its system through deep research and testing in the fields of financial trading and medical research. These vertical “proving grounds” were chosen due to the high volume of data, wide variety of data types and high complexity of decision-making necessary for valuable and successful results.

Beyond using machine learning and deep learning to identify patterns or make predictions, Sentient’s focus is on improved decision-making. Sentient’s Distributed AI platform builds on those predictions, continuously improving its decision-making by utilizing evolutionary algorithms at massive scale.
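Sentient’s platform is proprietary, but the evolutionary-computation idea it builds on is easy to sketch: score a population of candidate decision rules against data, keep the fittest, mutate them, and repeat. The toy Python example below shows that single-machine loop only; in Sentient’s case each fitness evaluation would be farmed out across its millions of processing nodes.

```python
# Toy sketch of evolutionary computation: evolve a candidate "decision
# rule" (here just a weight vector) against a fitness function. This is
# the single-machine idea only; a distributed system would evaluate each
# candidate on its own node.
import random

def fitness(weights, data):
    # Illustrative fitness: how often the weighted score agrees with the label.
    return sum(1 for x, label in data
               if (sum(w * v for w, v in zip(weights, x)) > 0) == label)

def mutate(weights, rate=0.1):
    return [w + random.gauss(0, rate) for w in weights]

def evolve(data, dim, pop_size=50, generations=100):
    population = [[random.uniform(-1, 1) for _ in range(dim)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=lambda w: fitness(w, data), reverse=True)
        elite = scored[:pop_size // 5]              # keep the top 20%
        # Refill the population with mutated copies of the elite.
        population = elite + [mutate(random.choice(elite))
                              for _ in range(pop_size - len(elite))]
    return max(population, key=lambda w: fitness(w, data))

# Tiny synthetic example: learn to separate points by the sign of x0 + x1.
data = [((x0, x1), x0 + x1 > 0)
        for x0, x1 in [(random.uniform(-1, 1), random.uniform(-1, 1))
                       for _ in range(200)]]
best = evolve(data, dim=2)
print(best, fitness(best, data))
```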

“I believe that Sentient’s unique machine learning approach, deployed at nearly unprecedented scale, will provide solutions to complex, urgent problems in a wide range of fields,” said Adam Cheyer, co-founder & VP of Engineering at Viv Labs and advisor to Sentient. “In artificial intelligence, we are beginning to see that it’s not only about ‘big data,’ but also about ‘big compute,’ which gives the ability to search deeper and at a higher complexity to find the patterns and answers hidden within.”

PARTNERSHIP WITH TATA COMMUNICATIONS

In addition to their investment, Tata Communications will be a preferred provider to Sentient to furnish a suite of enablement services, including data center space, managed hosting and network. Sentient and Tata Communications will also partner to develop additional products and services.

“We’re extremely encouraged by the progress that Sentient has made over the past year and we’re excited to have the support and funding of such a strong, global investor base,” said Antoine Blondeau, Chief Executive Officer, Sentient Technologies. “The new investment will allow us to continue to grow our team, further scale our partner infrastructure, and accelerate the commercialization of our distributed artificial intelligence technologies. Further, the expansion of our investor team brings a wealth of experience to our organization as we explore new vertical industries.”

Today, over 24% of the world’s Internet routes travel over Tata Communications’ global network, which includes the world’s largest wholly owned subsea cable network and a Tier 1 IP network ranked in the top five by routes on five continents, providing connectivity to over 240 countries and territories. The company also owns over 1 million square feet of data center and colocation space across 44 global locations.

CURRENT AND POTENTIAL CUSTOMERS

Sentient’s working model brings its distributed artificial intelligence technologies together with partner expertise, datasets and modeling challenges to solve complex, mission-critical, high-value problems. The company is seeking partners to explore solutions in the fields of healthcare, medical research, fraud detection, public safety, e-commerce, and other areas. Sentient is currently building APIs for faster and more independent partner development.

“Making sense of massive amounts of data is critical for consumer-facing digital businesses,” said Jörg Mohaupt of Access Industries. “We are delighted to be investors in Sentient and will apply its technology to our portfolio of e-commerce, media and entertainment businesses so that they can do innovative things and create new products for their customers. We look forward to supporting Sentient in its growth and development.”

WORLD CLASS TEAM OF LEADERS AND ADVISORS

Sentient’s management and advisors include executives with years of experience in a variety of relevant industry sectors from companies such as AiLive, Amazon, Cerberus, Citigroup, Mozilla, NASA Ames, Rackspace, Salesforce.com, SAP, SIRI, SRI International, Sybase, TIBCO and Yahoo!. Sentient team members have decades of combined experience building products and solutions in the artificial intelligence, big data, cloud computing, finance and security industries and they are bringing that knowledge to bear as they develop Sentient’s products and solutions, bringing value to their business partners.

“The MIT Computer Science and Artificial Intelligence Laboratory team worked with Sentient to deploy problems from our ICU blood pressure prediction analytics research,” said Una-May O’Reilly, Principal Research Scientist, MIT CSAIL. “Sentient’s unique evolutionary algorithm, mapped across tens of thousands of nodes, gave us access to a method that scales to vast resources and addresses highly complex problems. Sentient enabled us to solve problems previously thought too formidable to tackle due to scale.”

Source: Inside-BigData.com


Platfora, the native big data analytics platform for Hadoop, has introduced an analytics solution for the Internet of Things (IoT) that enables business users to work with machine and sensor data at scale without significant support from the IT department. With Platfora, organizations can create a data-driven analytics culture that accelerates development of new products and services, and enables them to correlate multi-structured data across the enterprise to prevent risk and create a better customer experience.

“Imagine IoT data as an iceberg. The challenge that organizations face is that the great majority of IoT data is below the waterline, where it cannot be accessed by traditional analytics tools without long cycles of data preparation,” said Ben Werther, founder and CEO of Platfora. “If you can’t easily work with data below the waterline, you cannot participate in the IoT era, because it’s impossible to answer the complex behavioral questions that arise when analyzing these new types of machine, transactional and customer interaction data.”

How Enterprises are Using Platfora for IoT

There are few better examples of business success in the IoT era than Vivint, a leading U.S.-based provider of smart home technology, including solar, security and control solutions. The company deployed Platfora to provide new insights and conduct visual analytics on machine and sensor data gathered across its ecosystem of enterprise data systems and millions of home automation devices. During one product quality monitoring exercise, analysts discovered anomalies in a small group of customer home systems. They could not pinpoint the problem using traditional analytics tools, so they worked iteratively in Platfora with various sensor and machine data sets until they found a consistent pattern, which turned out to be a minor manufacturing defect. This discovery enabled Vivint to roll service trucks to customer homes and solve the problem before most customers were even aware of the situation.

The Industry’s First Analytics Platform Built for IoT

Platfora Big Data Analytics is a full-stack, native-Hadoop platform that enables analysts, business professionals and data scientists to work iteratively with data in its rawest form, so they don’t have to structure data and formulate queries beforehand. Users work collaboratively in an intuitive, visual analytics environment, and can achieve insights on IoT-native data varieties (e.g., JSON, XML) with billions of unique values. With Platfora’s analytics solution for IoT, users can perform advanced data functions including:

  • Segmentation. Partition groups of connected devices and machines based on patterns of behavior and attributes, and via advanced analysis. Segmentation turns vast, anonymous big data sets into manageable, actionable groups for analysts (a generic sketch of the idea follows this list).

  • Deep behavioral analysis. Correlate the behavior of devices and data across the extended enterprise to conduct path analyses that reveal system success or failure, and device dependencies. This capability supports new product development, existing product performance analysis and security risk profiling, among other IoT use cases.
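Platfora itself is a visual, no-code environment, but the segmentation idea can be illustrated generically: cluster devices on behavioral features so analysts work with a handful of segments rather than millions of anonymous rows. The sketch below uses scikit-learn’s k-means on made-up telemetry features; it is not Platfora’s API.

```python
# Generic sketch of device segmentation (not Platfora's API): cluster
# devices on behavioral features so analysts can reason about a few
# segments instead of millions of anonymous rows. Features are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per device: [events_per_day, error_rate, avg_uptime_hours]
telemetry = np.array([
    [120, 0.01, 23.5],
    [115, 0.02, 23.9],
    [430, 0.20,  7.1],   # heavy, flaky device
    [440, 0.18,  6.8],
    [ 20, 0.00, 24.0],   # mostly idle device
])

scaled = StandardScaler().fit_transform(telemetry)
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
for device_id, seg in enumerate(segments):
    print("device %d -> segment %d" % (device_id, seg))
```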

“Providing non-technical employees with self-service access to data in Hadoop is creating lots of new business opportunities for our company and is helping us deliver a better customer experience,” said Brandon Bunker, senior director of customer analytics and intelligence at Vivint. “Platfora Big Data Analytics is purpose-built for these kinds of tasks. The platform is lightning fast, presents our data in a visually stunning way, and it’s surprisingly easy to use.”

Twitter and IBM (NYSE: IBM) have announced a landmark partnership that will help transform how businesses and institutions understand their customers, markets and trends – and inform every business decision. The alliance brings together Twitter data, which distinctively represents the public pulse of the planet, with IBM’s industry-leading cloud-based analytics, customer engagement platforms, and consulting services.

The collaboration will focus on three areas:

Integration of Twitter data with IBM analytics services on the cloud: IBM plans to offer Twitter data as part of select cloud-based services, including IBM Watson Analytics, a new cognitive service in the palm of your hand that brings intuitive visualization and predictive capabilities to business users; and a cloud-based data refinery service that enables application developers to embed data services in applications. Entrepreneurs and software developers will also be able to integrate Twitter data into new cloud services they are building with IBM’s Watson Developer Cloud or IBM Bluemix platform-as-a-service.

New data-intensive capabilities for the enterprise: IBM and Twitter will deliver a set of enterprise applications to help improve business decisions across industries and professions. The first joint solution will integrate Twitter data with IBM ExperienceOne customer engagement solutions, allowing sales, marketing, and customer service professionals to map sentiment and behavior to better engage and support their customers.

Specialized enterprise consulting: IBM Global Business Services professionals will have access to Twitter data to enrich consulting services for clients across business. Additionally, IBM and Twitter will collaborate to develop unique solutions for specific industries such as banking, consumer products, retail, and travel and transportation. The partnership will draw upon the skills of tens of thousands of IBM Global Business Services consultants and application professionals including consultants from the industry’s only integrated Strategy and Analytics practice, and IBM Interactive Experience, the world’s largest digital agency.

“Twitter provides a powerful new lens through which to look at the world – as both a platform for hundreds of millions of consumers and business professionals, and as a synthesizer of trends,” said Ginni Rometty, IBM Chairman, President and CEO. “This partnership, drawing on IBM’s leading cloud-based analytics platform, will help clients enrich business decisions with an entirely new class of data. This is the latest example of how IBM is reimagining work.”

By developing new solutions to improve business decisions across industries and professions, IBM and Twitter will be able to enrich existing enterprise data streams. For example, integrating social data with enterprise data can help accelerate product development by predicting long-term trends, or drive demand forecasting based on real-time conditions like weather patterns.

“When it comes to enterprise transformation, IBM is an undisputed global leader in enabling companies to take advantage of emerging technologies and platforms,” said Dick Costolo, Twitter CEO. “This important partnership with IBM will change the way business decisions are made – from identifying emerging market opportunities to better engaging clients, partners and employees.”

IBM has established the world’s deepest portfolio in big data and analytics consulting and technology expertise based on experiences drawn from more than 40,000 data and analytics client engagements. This analytics portfolio spans research and development, solutions, software and hardware, and includes more than 15,000 analytics consultants, 4,000 analytics patents, 6,000 industry solution business partners, and 400 IBM mathematicians who are helping clients use big data to transform their organizations.

“IBM brings a unique combination of cloud-based analytics solutions and a global services team that can help companies utilize this truly unique data,” said Chris Moody, Vice President of Twitter Data Strategy. “Companies have had successes with Twitter data – from manufacturers more effectively managing inventory to consumer electronics companies doing rapid product development. This partnership with IBM will allow faster innovation across a broader range of use cases at scale.”

For more information regarding the new Twitter and IBM collaboration, please visit www.ibm.com/IBMandTwitter or https://blog.twitter.com/ibm.


Small-time criminals are predictable; at least, that’s what London’s Metropolitan Police Service (MPS) are hoping. New software, developed by Accenture, pulls in large amounts of data already in use by the police service and puts it through an advanced analytics engine to predict when criminals are likely to strike.

The engine looks at aspects of an individual’s record, including geography, past offences and associations, and even keeps an eye on social media postings. Through analysis of five years’ worth of data, a picture can be painted of when, or if, a criminal will re-offend.


Accenture highlights the fact that police forces up and down the country are seeing funding cuts, and therefore experiencing problems with limited resources. The ability to effectively allocate these precious resources is important, and big data analysis helps to save cost whilst ensuring such a vital service is unaffected.

Social media is monitored to watch out for inflammatory comments, such as taunts aimed at other gang members or the organising of criminal activity. The data was gathered over a four-year period of monitoring gang members across 32 boroughs, and was then compared to criminal acts in the fifth year to see whether the software was accurate.
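That validation design, building a model on four years of history and checking its predictions against the fifth year, is a standard temporal holdout. The sketch below shows the generic pattern only; the data file and column names are hypothetical, and nothing here reflects the actual MPS/Accenture system.

```python
# Generic sketch of the temporal-holdout validation described above:
# fit a model on years 1-4 of historical records and score it on year 5.
# The file and columns are illustrative, not the MPS/Accenture system.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

records = pd.read_csv("history.csv")   # hypothetical file with a 'year' column

train = records[records["year"] <= 4]
test = records[records["year"] == 5]
features = ["prior_offences", "known_associates", "flagged_posts"]  # illustrative

model = LogisticRegression(max_iter=1000)
model.fit(train[features], train["reoffended"])

# Evaluate on the held-out fifth year, exactly as the article describes.
preds = model.predict_proba(test[features])[:, 1]
print("year-5 AUC:", roc_auc_score(test["reoffended"], preds))
```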

In terms of public reception, this could be seen as invasive in the wake of Edward Snowden’s revelations about NSA mass surveillance. The public is more likely to be accepting if the potential benefits are clear, but privacy campaign group Big Brother Watch is requesting that more information be made public about the initiative.

Although this is said to be the first time Accenture’s analytics have been used in the UK, the firm’s software has been used for similar reasons in Spain, and in Singapore the company tested software which monitors video feeds of crowds, traffic and other events to alert the authorities to potential risks.

“It is clear that harnessing and analysing vast data sets may simplify the work of the police,” said European human rights group Statewatch earlier this year.

“However, this in itself is not a justification for their use. There are all sorts of powers that could be given to law enforcement agencies, but which are not, due to the need to protect individual rights and the rule of law – effectiveness should never be the only yardstick by which law enforcement powers are assessed.

“The ends of crime detection, prevention and reduction cannot in themselves justify the means of indiscriminate data-gathering and processing.”

Should police be using data analytics to predict future offences?

 

Source: cloudcomputing-news.net

BigPanda formally launched a new data science platform to automate IT Incident Management. BigPanda’s platform analyzes the flood of alerts that IT teams face every day and clusters them into high-level incidents; it then automates the manual processes involved in detecting, investigating and collaborating around every IT incident. This enables companies to resolve IT issues faster and minimize their impact on customers and revenue.

Data centers have changed dramatically in the last decade. IT and DevOps teams are struggling with traditional approaches to Incident Management that have not kept pace with those changes.

Two major changes, in particular, have been a major source of pain:

  1. Data Centers have Exploded in Scale and Complexity due to the Cloud and Virtualization. As the moving parts have multiplied, so has the number of IT incidents that require immediate attention. The average BigPanda user has thousands of daily alerts, and that number is growing exponentially. Today, that mountain of alerts must be manually detected, investigated and managed by IT and DevOps teams, which has turned into a major drain on time, people and efficiency.
  2. Data Center Monitoring has become Highly Fragmented. Companies are shifting away from monolithic data center monitoring vendors, like HP and IBM, towards using multiple tools such as Splunk, New Relic, Nagios, Zabbix and Pingdom. Companies use on average five different monitoring tools, none of which speak the same language. When IT incidents occur, correlating alerts and connecting the dots between all those fragmented tools is a time-consuming and error-prone task.

Despite these changes, companies continue to use Incident Management solutions that have not evolved to meet these new challenges. Legacy solutions focus on helping teams to organize and track their activities, but leave the heavy lifting of detecting, investigating and collaborating on alerts up to individuals. That has left IT and DevOps teams struggling to keep up with new challenges arising from data center scale and fragmentation.

“The new generation of IT infrastructure requires a fundamentally different approach to Incident Management,” said Assaf Resnick, Co-Founder and CEO of BigPanda. “We believe that only through leveraging data science can IT teams tackle the scale of machines, events and dependencies that must be understood and managed. That’s why we founded BigPanda.”

BigPanda’s core innovation lies in its data science approach, which automates the time-consuming processes involved in responding to IT issues. It does this through a SaaS platform that aggregates and normalizes alerts from leading monitoring systems, such as New Relic, Nagios and Splunk, as well as home-built monitoring solutions, and then leverages data algorithms to take the heavy lifting out of Incident Management (a simplified sketch of this kind of alert clustering follows the list below). BigPanda:

  • Consolidates Noisy Alerts: BigPanda automatically clusters the daily flood of alerts into high level incidents, so IT can quickly see critical issues without having to dig.
  • Correlates Alerts and Changes: BigPanda correlates IT incidents with the code deployments and infrastructure changes that may have caused them, so IT and DevOps teams have instant access to the data they need to make smart decisions quickly.
  • Streamlines Collaboration: BigPanda makes it easy to notify the right people and keep everyone updated on incident status, notes, activities, metrics, and more. It syncs seamlessly with ServiceNow, JIRA and Remedy, which frees IT from having to manually manage tickets and keep them up-to-date.
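As noted above, here is a simplified sketch of one basic form of alert clustering: alerts on the same service that arrive within a short time window are folded into a single incident. BigPanda’s actual algorithms are proprietary and far more sophisticated; the alert fields and window size here are illustrative.

```python
# Minimal sketch of time-window alert clustering (not BigPanda's actual
# algorithm): alerts on the same service within a 5-minute window are
# folded into one incident. Alert fields are illustrative.
from collections import defaultdict

WINDOW_SECONDS = 300

def cluster_alerts(alerts):
    """Group (timestamp, service, message) alerts into incidents."""
    incidents = defaultdict(list)   # service -> list of incidents (alert lists)
    for ts, service, message in sorted(alerts):
        per_service = incidents[service]
        # Join the latest incident for this service if it is still "hot",
        # otherwise open a new one.
        if per_service and ts - per_service[-1][-1][0] <= WINDOW_SECONDS:
            per_service[-1].append((ts, service, message))
        else:
            per_service.append([(ts, service, message)])
    return [inc for per_service in incidents.values() for inc in per_service]

alerts = [
    (100, "db",  "replication lag high"),
    (160, "db",  "replication lag critical"),
    (900, "db",  "disk 90% full"),          # new incident: outside the window
    (110, "web", "5xx rate spike"),
]
for incident in cluster_alerts(alerts):
    print(len(incident), "alert(s):", incident[0][2])
```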

“Any modern Ops environment at scale will hit the pain points BigPanda is solving. There’s a strong need for this product,” said Kevin Park, Head of Tech Ops and IT at Dropbox.

Pricing and Availability
BigPanda is available for a free 30-day trial. A lite version is available for free; pricing starts at $1,500 per month for a company-wide license.

Source: Inside-BigData.com

 


Gartner presented its top 10 strategic technology trends for 2015 at the annual Gartner Symposium/ITxpo 2014, held in Orlando earlier this month. Computing Everywhere, the Internet of Things (IoT) and 3D Printing are projected to be the three most important strategic technology trends in 2015.

3D Printing Will Continue To Revolutionize Prototyping And Manufacturing 

3D printing is forecast to reach a tipping point in the next three years thanks to streamlined prototyping and short-run manufacturing. Improving time-to-market, ensuring greater accuracy of highly customized products, and reducing long-term production costs are three of the many reasons companies are adopting 3D printing today. Be sure to read Larry Dignan’s excellent post covering the conference and the top ten strategic technology trends, 3D printing turns strategic in 2015, says Gartner.

Taking Analytics To The Next Level in 2015

Advanced, pervasive and invisible analytics, context-rich systems, and smart machines are also included in the top 10 strategic technology trends for 2015. Given how quickly analytics is maturing as a technology category, it’s understandable why Gartner ranked this area as the 4th most strategic. In 2015, analytics will move beyond providing dashboards with metrics and Key Performance Indicators (KPIs) to a more intuitive series of applications that give business analysts the flexibility to define models and test them in real time. Alteryx and Tableau are interesting companies to watch in this area, and Tableau Public (free, opt-in) is worth checking out for its advanced visualization features.

Cloud Computing Becomes Part Of The New IT Reality

The last four technology trends Gartner mentions include cloud/client computing, software-defined applications and infrastructure, Web-scale IT and risk-based security and self-protection.

The following graphic provides an overview of the top 10 strategic technology trends for 2015.

 

[Infographic: Gartner’s top 10 strategic technology trends for 2015]

Source: SmartDataCollective.com