
Oracle revamps cloud analytics service to simplify access

Oracle today unveiled a revamped cloud analytics service that aims to reach a wider range of users via a Redwood user interface (UI) the company is publicly showing for the first time within a live application environment.

The UI will eventually be employed across the entire Oracle applications portfolio, Oracle Analytics VP Joey Fitts told VentureBeat.

The Redwood UI is at the core of an Oracle Analytics Cloud strategy that surfaces a common pool of data to end users, business analysts, and data scientists, rather than requiring organizations to acquire, populate, and manage data across multiple platforms to address each use case, Fitts added. The goal is to make it easier for users with varying levels of analytics expertise to collaborate more effectively, he explained.

More updates

Today Oracle is also launching a mobile application that makes Oracle Analytics Cloud more accessible to members of a geographically distributed team. That application includes a “podcast” capability that leverages a natural language processing (NLP) engine to identify and narrate the relationships between various sets of data surfaced through a dashboard via a speech interface. Oracle Analytics Cloud allows users to query data in natural language using either text or speech in 28 different languages.

And Oracle is expanding its machine learning capabilities to offer users simple explanations of the factors that influenced a recommendation. Users can employ those explanations to adjust factors in a way that fine-tunes results. That capability makes artificial intelligence (AI) capabilities accessible to all types of users, Fitts said, adding “AI should be both applied and invisible.”
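Explanation features like this generally work by attributing a prediction to the individual factors that fed into it. As a generic illustration of the idea (not Oracle's actual implementation, and with made-up feature names), a linear model's score can be decomposed into per-feature contributions that a user could then adjust:

```python
# Generic sketch of factor-level explanations for a linear model.
# This is NOT Oracle's engine -- just the common idea of decomposing
# a prediction into per-feature contributions a user can inspect.

def explain_linear_prediction(weights, bias, features):
    """Return the final score and each feature's contribution to it."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = bias + sum(contributions.values())
    return score, contributions

# Hypothetical factors behind a sales forecast:
weights = {"ad_spend": 0.8, "seasonality": 1.5, "price": -0.3}
features = {"ad_spend": 10.0, "seasonality": 2.0, "price": 20.0}

score, contributions = explain_linear_prediction(weights, 1.0, features)
# "price" contributes negatively here, so a user fine-tuning the
# result knows which input to adjust.
```

Real recommendation models are rarely this simple, but the user-facing contract is the same: show which factors pushed the result up or down.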

At the same time, Oracle is adding support for built-in text analytics, affinity analytics to discover relationships between datasets more easily, graph analytics, and custom map analytics for embedding images using the Web Map Service (WMS) protocol and XYZ tile layers.
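The XYZ tile layers mentioned above follow the standard "slippy map" addressing scheme, in which each map tile is identified by a zoom level and a column/row pair. A minimal sketch of that coordinate math (this is the generic web-mapping convention, not Oracle-specific code):

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to XYZ (slippy-map) tile coordinates."""
    n = 2 ** zoom  # number of tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Tile containing central London at zoom level 10:
x, y = latlon_to_tile(51.5074, -0.1278, 10)
# A tile server URL then follows the {z}/{x}/{y} pattern, e.g.
# f"https://tile.example.com/10/{x}/{y}.png" (hypothetical host).
```

WMS works differently: instead of fixed tiles, the client requests an arbitrary bounding box via a GetMap request, which is why the two conventions are supported side by side.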

Oracle is also adding a data profiling engine that samples and scans data to identify quality issues and proactively flag misuse of sensitive data, along with recommendations to fix problems in fields such as zip codes and end user-defined product categories. Data preparation tools will also automatically associate geographic content with the right type of visualization.
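Profiling engines of this kind typically sample a column and apply per-field format checks. A minimal, hypothetical sketch of flagging malformed US zip codes (again, an illustration of the technique, not Oracle's engine):

```python
import re

# US ZIP or ZIP+4 format, anchored to the whole value.
ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

def profile_zip_column(values, sample_size=1000):
    """Scan a sample of a column and report malformed zip codes."""
    sample = values[:sample_size]
    bad = [v for v in sample if not ZIP_PATTERN.match(str(v))]
    return {
        "sampled": len(sample),
        "invalid": len(bad),
        "examples": bad[:5],
        # A real engine would suggest concrete fixes per issue:
        "suggestion": "pad to 5 digits / strip whitespace" if bad else None,
    }

report = profile_zip_column(["94065", "9406", "10001-0001", " 30301"])
# "9406" is too short and " 30301" has a leading space, so both
# are flagged as invalid.
```

Running such checks on a sample rather than the full table is what keeps profiling cheap enough to run proactively.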

Cloud era

As a provider of relational database platforms that are widely employed in on-premises IT environments, Oracle is moving to ensure it remains relevant in the age of the cloud. In addition to Oracle Analytics Cloud, the company makes available a managed Autonomous Database service through which lower-level database administration tasks are automated. Regardless of use case, Oracle is encouraging customers to employ one of its cloud services rather than rival database and analytics services provided by Amazon Web Services (AWS), Microsoft, and Google.

It’s too early to say how that titanic battle might play out, but Oracle — and to a lesser degree Microsoft — has a strategic advantage, in that the bulk of data still resides in an on-premises IT platform it provides. As hybrid cloud computing continues to evolve, it becomes easier for IT organizations to federate the management of data across multiple platforms using an incumbent vendor than it is to replace entire on-premises environments. AWS and Google are both making hybrid cloud computing cases that would require organizations to replace existing infrastructure or migrate all of their data into a cloud platform.

There are plenty of examples of organizations deciding to abandon local datacenters entirely. But many companies continue to deploy applications in on-premises IT environments, citing compliance, security, and performance advantages. After more than 10 years of cloud computing, the bulk of enterprise data remains in an on-premises IT environment, which suggests most organizations will continue to selectively migrate applications to the cloud at a time and place of their choosing.

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021
  • networking features, and more

Become a member



Razer revamps the Blade Pro 17 with 9th-gen Intel Core and Nvidia RTX in a trimmer package

The Razer Blade Pro 17 is one of the first laptops out of the gate with two shiny new parts: Intel’s 9th-gen Core i7 mobile chip, with 6 cores and 12 threads of processing goodness; and Nvidia’s RTX 2080 with Max-Q. Further improvements include a trimmer overall package, narrow bezels, and connectivity to spare. Prices start at $2,499, and units will start shipping in May.

The Razer Blade Pro 17 was long overdue for a refresh, and almost everything is new—and more compact. The chassis is 25 percent smaller than its predecessor in overall volume, and about 20 percent lighter at 6 pounds, compared to 7.4 pounds for the prior model. The bezels are 6mm thin and surround the sole display option, a 1080p/144Hz non-G-Sync panel.

The laptop is actually slightly thicker than before, but Razer did that to maximize thermal performance and fit in a bit more connectivity. That includes five USB 3.2 Gen 2 ports—three Type-A and two Type-C; one of the latter also supports Thunderbolt 3. You also get HDMI 2.0b, Realtek 2.5Gb ethernet, and a UHS-III SD card slot, which Razer says is faster than the one integrated into prior models.

Why this matters: The Razer Blade Pro 17 is catching the wave of next-gen hardware from major manufacturers. Both Intel’s 9th-gen Core mobile chips and Nvidia’s RTX mobile GPUs for gamers promise to make laptops ever-more competitive with desktops for all but the most hardcore gamers. We’ll let you know how close they come if we get a chance to test the new Blade Pro 17.




IBM revamps its storage lineup to better enable hybrid cloud computing

As part of a larger effort to make it easier to manage data across a hybrid cloud computing environment, IBM unveiled a 1U all-flash storage system for on-premises IT environments that can scale to hold 1.7 petabytes (PB) of data.

The amount of storage capacity required by IT organizations is steadily increasing as they train AI models on data that, for compliance and security reasons, can’t be shifted to a cloud computing environment, said Denis Kennelly, general manager for IBM Storage.

The FlashSystem 5200, with data capacity starting at 38 TB, is now the entry-level member of IBM’s all-flash storage family and is priced 20% lower than its predecessor. IBM is also adding 2U models to the FlashSystem series that are designed to deliver higher I/O performance.

The IBM storage systems are unique in that they are all compatible with the IBM Spectrum storage management software, which IBM makes available on its own cloud as well as on Amazon Web Services (AWS), said Kennelly. IBM has also committed to making IBM Spectrum Virtualize for Public Cloud software available on Microsoft Azure in the third quarter. That capability is critical because it enables IT teams to replicate and migrate data across hybrid cloud computing environments, Kennelly added.

Finally, IBM also announced today that next month it will add support for IBM Cloud Satellite to its FlashSystem systems as well as IBM SAN Volume Controller, IBM Elastic Storage System, and IBM Spectrum Scale software. IBM Cloud Satellite, currently in beta, is a management platform that IBM created to centralize the management of hybrid cloud computing environments. IBM Cloud Satellite is built on an instance of the Red Hat OpenShift platform running on Red Hat Enterprise Linux (RHEL), which makes it possible to deploy the management platform anywhere.

In general, the ability to move data between multiple clouds and on-premises IT environments has become a critical requirement as the centers of data gravity in the enterprise continue to shift, said Kennelly. Organizations need to be able to flexibly move and replicate data that needs to be accessed by a growing number of applications running on different platforms. It’s not always feasible or practical to remotely access data when many of the applications running are increasingly latency-sensitive thanks in part to increased reliance on microservices.

At the same time, the amount of data that is being accessed is increasing as organizations look to infuse AI capabilities into their applications. The AI models being constructed require access to massive amounts of data.

Collectively, all those requirements create a need to be able to manage and govern data more efficiently than ever, said Kennelly.

“Data is the new oil for business,” said Kennelly. “But data, like kerosene, in the wrong hands is a dangerous thing.”

Ultimately, traditional IT operations will need to absorb what is today often separate DataOps and machine learning operations (MLOps) disciplines that have emerged around data science initiatives, said Kennelly. In the meantime, IBM is making a case for an approach to storage that will, longer term, make it easier to achieve that goal.

IBM, of course, is not the only provider of storage and data management platforms with similar ambitions. However, now that IBM has positioned IBM Cloud as one platform among many it supports, its entire approach to hybrid cloud computing continues to evolve. The challenge, of course, is that hybrid cloud computing requires a lot more to achieve than simply accessing compute resources on different platforms. The data those compute engines need to access needs to be just as readily accessible whenever and wherever required.
