
IBM Machine Learning is a data mining AI for the private cloud

IBM brings machine learning to mainframe computers in the private cloud. Image: TheTechNews.

On Wednesday, IBM announced a new cognitive platform for mainframe computers: IBM Machine Learning. The technology will be available across all z System models, equipping them to handle data-heavy analytics tasks in the private cloud.

IBM drew the core capabilities of IBM Machine Learning from Watson, its artificial intelligence engine. The platform will let large enterprises managing data on z/OS create and deploy analytic models seamlessly, in nearly any language and framework.

Only zSeries mainframes will support the technology for now, but it will soon be available for POWER Systems as well. IBM’s traditional customers will gain the ability to boost performance on their own machines much as they would in the public cloud.

What are mainframe computers?

Experts work on a huge IBM computer. Image: Business Insider.

Mainframes are large-scale computers that IBM has developed and manufactured since the mid-1950s. These machines can handle substantially more data and perform far more demanding workloads than regular, consumer-oriented towers.

The company’s target market for mainframes has always been data-sensitive organizations, the same ones still running z System computers today. Governments, banks, insurance companies, major retailers, and transportation businesses use these machines to store and analyze their data.

Because of its sensitivity, that data is kept in rooms full of IBM servers, accessible to no one other than the data scientists and specialists working for these organizations.

Modern zSeries models running z/OS can handle as many as 2.5 billion transactions per day. That volume of purchases and sales is roughly the equivalent of 100 Cyber Mondays in just 24 hours.
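As a quick sanity check on that comparison (the arithmetic below is ours, not a figure from IBM), the numbers imply roughly 25 million transactions per Cyber Monday:

    \frac{2.5 \times 10^{9}\ \text{transactions per day}}{100\ \text{Cyber Mondays}} = 2.5 \times 10^{7}\ \text{transactions per Cyber Monday}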

What can IBM Machine Learning do?

IBM’s Watson capabilities are available to anyone on the cloud, but on closed systems such as mainframes, data scientists have had to do all the work of calculating, creating, testing, and deploying models to analyze their huge amounts of data.

“Over 90 percent of the data in the world can’t be Googled. It resides behind firewalls on private clouds. How do we automate intelligence for these data sources?” asks IBM Analytics GM Rob Thomas.

With the rollout of IBM Machine Learning, Big Blue essentially wants to offer an enterprise solution to its most treasured customers. Experts can now harness the power of artificial intelligence to automate these processes and improve both performance and efficiency.

The platform will support models written in popular industry languages such as Python and Scala and frameworks like Apache SparkML and TensorFlow, and it will get smarter over time by learning which algorithms work best for each task.
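IBM has not published sample code alongside this announcement, but the kind of model a data scientist might hand to the platform could look like the minimal Apache Spark ML sketch below, written in Python. The fraud-style transaction data, column names, and application name are illustrative assumptions, not details from IBM’s release.

    # Minimal sketch of a Spark ML pipeline in Python (PySpark); only Apache Spark is assumed.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("transaction-model-sketch").getOrCreate()

    # Hypothetical transaction records: amount, transactions per hour, and a fraud label.
    df = spark.createDataFrame(
        [(120.0, 3.0, 0), (5400.0, 27.0, 1), (89.5, 1.0, 0), (9100.0, 45.0, 1)],
        ["amount", "tx_per_hour", "label"],
    )

    # Assemble the numeric columns into a feature vector and fit a logistic regression.
    assembler = VectorAssembler(inputCols=["amount", "tx_per_hour"], outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="label")
    model = Pipeline(stages=[assembler, lr]).fit(df)

    # Score the same records to confirm the pipeline runs end to end.
    model.transform(df).select("amount", "tx_per_hour", "prediction").show()

    spark.stop()

Once a model like this is trained and validated, a platform such as IBM Machine Learning would take over the deployment and ongoing tuning that the article describes.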

Source: IBM

Rafael Fariñas