
How Onyxia uses security AI to help CISOs improve their security posture



Managing cybersecurity risks is challenging, not necessarily because vulnerabilities are hard to find, but because most organizations rely on manual processes to do so. However, security AI has the potential to automatically measure risks in the environment, and provide recommendations on what to address first. 

Security provider Onyxia, which launched today with $5 million in seed funding, demonstrates this approach by enabling organizations to use artificial intelligence (AI) to monitor their security posture in real time. 

As modern networks grow more complex, AI-driven solutions will become more important for identifying gaps in an enterprise’s defenses and reducing the chance that threat actors can exploit vulnerabilities.

Using security AI to mitigate risk 

The key challenge of mitigating cyber-risk is understanding that the level of risk isn’t static; it changes as technology and users move in and out of the environment.


In environments that aren’t driven by AI, security teams and CISOs can struggle to keep up with the rate at which the environment changes. At the same time, the pace of work makes it difficult to make accurate judgment calls on which security risks to address first to improve the overall security posture of the organization.

By using AI, an organization can eliminate this guesswork and start accurately assessing what actions it can take to better secure its environment.

“Help Net Security reported that out of 3,800 CISOs surveyed, 61% of security teams are understaffed and 69% say that hiring managers don’t accurately understand their company’s cybersecurity hiring needs, adding training and educational responsibilities that most IT teams cannot spare,” said Sivan Tehila, CEO of Onyxia.

“Currently, security priorities are shifting as 90% of organizations fail to address cybersecurity risks. Onyxia enables CISOs and security teams to gain a holistic view of their entire cybersecurity environment while highlighting the best solutions and strategies to close security gaps, filling in the gaps that they didn’t know existed,” Tehila said. 

Onyxia is well placed to meet these challenges, given founder Sivan Tehila’s technology pedigree: she previously served as CISO of the research and analysis division and head of information security for the Israel Defense Forces (IDF).

The vendor’s solution uses machine learning (ML) and AI to provide CISOs with custom suggestions on how to improve their organization’s security posture. These suggestions are based on industry-specific needs, special risks and budget, and they enable decision-makers to find the most effective way to improve cyber-resilience.

AI risk management solutions 

According to Tehila, Onyxia is defining a new solution category for security teams and has no direct competitors.

“Onyxia is a proactive solution that takes internal and macro-environment factors into account. A proactive solution is necessary for security managers to have real-time insight into their cybersecurity postures and implement proactive measures to ensure business continuity. Currently, most of these processes are done manually,” Tehila said.

It’s important to note, though, that Onyxia isn’t the only provider leveraging AI to identify risks in enterprise environments. For instance, Securiti uses AI to automatically map structured and unstructured data records in real time while providing an overview of risk scores for that data. Securiti most recently raised $50 million as part of a series B funding round.

Similarly, OneTrust uses AI to discover and classify data, identifying at-risk data and enabling the user to monitor it with analytics displays. To date, OneTrust has raised $920 million in funding.


How Tymely aims to improve chatbot conversations



As digitization continues to shape consumer behavior toward ecommerce businesses, consumers are increasingly demanding fast and convenient online shopping experiences. Fueled by the COVID-19 pandemic, that demand also increased the online presence of ecommerce businesses. With more enterprises riding the digital transformation wave, positive customer experience (CX) is crucial to customer acquisition and improving sales.

In 2021, Vonage listed chatbots (40%) as the second most preferred communication channel for consumers. Shopify’s Future of Commerce Trend 2022 Report revealed 58% of consumers purchased from brands where they’ve experienced excellent CX. The report further showed more brands (44%) plan to invest in asynchronous chat experiences to manage customer responses. Undoubtedly, many ecommerce brands are becoming more aware of the impacts of CX, and are turning to artificial intelligence (AI) tools like chatbots to improve customer service.

However, while chatbots have become a critical part of the customer journey today, issues around personalization persist. In 2019, Forrester reported that 54% of online consumers in the U.S. believed interacting with a chatbot “has a negative impact on the quality of their life.”

This implies that even though chatbots are great tools, they aren’t perfect yet. However, Ohad Rozen, cofounder and CEO of chatbot provider Tymely, believes that adding human supervision to the process enables human-level personalization.


The company, which today announced it raised $7 million to “make AI converse better,” claims it uses AI-human hybrid tech to enable brands to provide email and chat support services in a more human, empathetic and precise way. 

Rozen told VentureBeat in an interview that in addition to Tymely’s cutting-edge natural language processing (NLP) models, the company takes a human-in-the-loop approach to close the gap between today’s technology stack and optimal customer service.

The rise and fall of chatbots

Chatbots are AI-powered programs that provide on-demand customer service — and unlike human agents, chatbots are always available. In 2018, Drift reported that 64% of consumers listed 24-hour service as chatbots’ most helpful feature, while 55% were impressed with their swift responses.

Although chatbots are fast and readily available, creating personalized messages is still a blocker. This is because of their inability to comprehend the nuanced industry-specific languages customers use. WATConsult’s 2021 research adds more weight to this stance, revealing the main blockers to using chatbots are lack of understanding (50%), inability to solve complex issues (47%), and lack of personal service experience (45%). 

Chatbots’ self-service performance is also statistically underwhelming. According to a report by Gartner, chatbot self-service resolves only 9% of queries without a human touch. Chatbots also have limited use for customer engagement, and chatbots with poor customer service output are bad news for sales; for example, chatbots caused sales to decline by 80% in 2019.

Because of this limited customer service functionality, many companies have been slow to adopt the technology. For instance, fashion retailer Everlane ditched the Facebook Messenger chatbot after it recorded high failure rates in 2017. Along those same lines, in 2018, Accenture reported that 53% of organizations “have no plans” to invest in chatbots.

Tymely’s intervention

Tymely claims its AI technology can create personalized messages. Launched in 2022, the company says that it is building an AI that understands complex human language to improve CX. Unlike most chatbots and other fully automated solutions, Tymely claims it has a human-level understanding of the customers’ language, with its technology being a mix of people and AI. 

Rozen also believes the human touch is the answer to creating empathetic messages that regular chatbots lack.

“Tymely employs experts that review each AI input and, if needed, correct it in real-time. This results in human-level accuracy that enables us to understand tiny and implicit nuances in a customer’s text; a high-resolution understanding that also allows us to generate hyper-personalized and empathetic responses to customers, according to the brand’s voice and policy,” he said.
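
Tymely hasn’t published implementation details, but the general shape of the human-in-the-loop workflow Rozen describes can be sketched in a few lines: the model drafts a reply and scores its own confidence, and anything below a threshold is routed to a human expert before it reaches the customer. Everything below (the stand-in model call, the threshold, the reviewer step) is a hypothetical illustration rather than Tymely’s actual system.

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.9  # hypothetical cutoff for routing a draft to a human expert

@dataclass
class DraftReply:
    text: str
    confidence: float

def generate_draft(customer_message: str) -> DraftReply:
    """Stand-in for an NLP model that drafts a reply and scores its own confidence."""
    # A real system would call the company's NLU/generation stack here.
    return DraftReply(text=f"Thanks for reaching out about: {customer_message}", confidence=0.72)

def human_review(draft: DraftReply) -> str:
    """Stand-in for the expert review step; a reviewer would edit the draft as needed."""
    return f"[reviewed] {draft.text}"

def respond(customer_message: str) -> str:
    draft = generate_draft(customer_message)
    if draft.confidence >= CONFIDENCE_THRESHOLD:
        return draft.text          # confident enough to send as-is
    return human_review(draft)     # low confidence: an expert corrects it first

print(respond("Where is my order?"))
```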

Rozen also noted that Tymely can improve the efficiency of contact centers because it’s fully digital, helping businesses save on headcount costs. He further noted that Tymely AI costs 50% to 80% less than outsourced contact centers. “And unlike contact centers, Tymely commits to SLAs in minutes, not hours,” he added.

This new funding boost was led by venture capital firm Hetz Ventures and DESCOvery, the D. E. Shaw group’s venture studio. In a statement announcing the funding, Rozen revealed that Tymely plans to use the funding to “improve its natural language understanding (NLU) technology” for better service offerings.


Ironclad’s new contract platform embeds AI to improve business workflows



Ironclad yesterday unveiled a new version of its contract platform embedded with an AI layer in an effort to improve business workflows throughout the lifecycle of a contract.

Organizations can create contracts 60% faster by automating the contract creation process, according to Jason Boehmig, the company’s CEO and co-founder. They will also have the capability to “slice and dice” all the operational data in previously executed contracts, he said. 

“They have a whole mountain of contracts that existed before they worked with us” that are static PDFs, Boehmig told VentureBeat. Ironclad Smart Import uses optical character recognition (OCR) to convert PDF files to DOCX for editing. The software scans, indexes, tags, and stores contract data at scale. The new platform is designed to make contracts full-text searchable, automate data extraction, and surface key terms — such as renewal dates.
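
Ironclad hasn’t shared how Smart Import is built, but the broad shape of such a pipeline (OCR the scanned PDF, index the text, then pull out key terms like renewal dates) can be sketched as follows. This is an illustration of the approach, not Ironclad’s code; it assumes the pdf2image and pytesseract libraries, a locally installed Tesseract engine, and a hypothetical old_contract.pdf file.

```python
import re

import pytesseract                      # wrapper around the Tesseract OCR engine
from pdf2image import convert_from_path

def ocr_pdf(path: str) -> str:
    """Render each PDF page to an image and run OCR over it."""
    pages = convert_from_path(path)
    return "\n".join(pytesseract.image_to_string(page) for page in pages)

def extract_renewal_dates(text: str) -> list[str]:
    """Naive key-term extraction: find dates that appear shortly after the word 'renew'."""
    pattern = r"renew\w*\D{0,40}?(\d{1,2}/\d{1,2}/\d{2,4}|\w+ \d{1,2}, \d{4})"
    return re.findall(pattern, text, flags=re.IGNORECASE)

text = ocr_pdf("old_contract.pdf")      # hypothetical scanned contract
print(extract_renewal_dates(text))      # any dates found near renewal language
```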

This way, a company’s customer support team can reach out to a customer to see if they are going to renew a contract, Boehmig explained. “If you miss that, you lose revenue. Now, [contracts] … are fully living, breathing documents because of the AI analysis that went into tagging them and making them searchable.”


The AI functionality also makes it possible to recognize who needs to approve a contract and automatically route it to them, he said. 

The new world of contract creation

Also yesterday, Ironclad launched in beta Ironclad Playbooks, which uses AI-powered clause detection so customers can review and negotiate contracts. Playbooks is designed to automatically analyze contracts, flag areas that require a thorough review, and provide suggestions on how to negotiate based on legal-approved guidelines, the company said.

Contract creation in the “old world looks like a human with a checklist reading contracts line by line and checking off boxes and making sure every sentence complies with the checklist,” Cai GoGwilt, co-founder and CTO at Ironclad, told VentureBeat. That person would have to do negotiations using a redlining process and go back and forth over email or write things out and scan them in, he added. “The new world is accelerating that” using AI to “intelligently negotiate and review contracts at scale.”

The software scans every part of the contract, matches it against the organization’s preconfigured playbooks, tracks whether every line in the playbook is in compliance, and suggests language that can be swapped in, GoGwilt said. “It’s contextual and empowers the user to make better decisions more quickly.”
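
As a rough illustration of that matching step (not Ironclad’s actual implementation), a playbook can be modeled as a set of required clauses with approved fallback language; each contract clause is compared against the playbook, and anything that deviates gets flagged with a suggested swap. The playbook entries, similarity threshold, and example contract below are all hypothetical.

```python
from difflib import SequenceMatcher

# Hypothetical playbook: clause topic -> (required language, approved suggestion to swap in)
PLAYBOOK = {
    "limitation of liability": (
        "Liability is capped at the fees paid in the prior 12 months.",
        "Suggest: cap liability at fees paid in the prior 12 months.",
    ),
    "governing law": (
        "This agreement is governed by the laws of the State of California.",
        "Suggest: governing law should be California.",
    ),
}

def similarity(a: str, b: str) -> float:
    """Crude text similarity; real systems would use trained clause-detection models."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def review(contract_clauses: dict[str, str], threshold: float = 0.8) -> list[str]:
    """Flag clauses that deviate from the playbook and attach the approved suggestion."""
    flags = []
    for topic, (required, suggestion) in PLAYBOOK.items():
        clause = contract_clauses.get(topic, "")
        if similarity(clause, required) < threshold:
            flags.append(f"{topic}: needs review. {suggestion}")
    return flags

contract = {"governing law": "This agreement is governed by the laws of New York."}
print(review(contract))
```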

“When you’re dealing with 20- to 30-page vendor contracts, the manual review process takes a massive amount of time – but it’s critical work,” said Charles Hurr, associate general counsel at L’Oréal, in a statement. “Ironclad AI automatically reviews these contracts, flags language and clauses that don’t work for us, and suggests L’Oréal-approved provisions to swap in.”

This cuts the review process from hours to minutes, Hurr said, and improves his team’s efficiency, freeing up time for people to focus on more high-impact work.

“Our goal is to keep legal out of 95% of our contracts, and Ironclad’s AI-driven workflows, permission controls, and analytics get us there,” Catherine Choe, director of legal at Everlaw, said in a statement. “Ironclad has helped our team facilitate growth by dramatically speeding up the contract upload and review process, all while maintaining compliance and mitigating risk.”

AI and analytics

Ironclad said the new AI tools come on the heels of the release of Ironclad Insights, a contract analytics and visualization platform. Because Ironclad automatically captures both metadata and process data, Insights is designed to let users create visualizations of crucial operational and business data to make faster decisions, pinpoint bottlenecks, and present findings in a digestible way for key stakeholders.

Pricing for the new platform is based on the number of users, Ironclad said.

Earlier this year, Ironclad announced it had raised $150 million in Series E financing from Franklin Templeton, a global investment management firm, bringing its total financing to $333 million.


How DDR5 can improve PC gaming and still be a useless upgrade

DDR5 — it’s all PC gamers can talk about now that AMD Ryzen 7000 is about to launch. Although Intel has supported DDR5 since the launch of its 12th-gen Alder Lake processors, Ryzen 7000 is the catalyst that will kill last-gen DDR4 off for good. When you next upgrade your PC, you’ll need DDR5, but paying up for a faster kit of memory may not translate into real-world performance gains.

One of the best DDR5 kits will still offer a great gaming experience, but the delicate balance of speed and latency puts high-end DDR5 in a precarious position. On one hand, faster DDR5 can offer practical differences in some games, but on the other, even faster kits can result in lower performance. And in some games, RAM speed doesn’t matter at all.

It sounds like a lot, and it is. But stick with me, and I’ll help you understand what to look for when buying your first DDR5 kit so you can get past misleading marketing and have the best gaming experience possible.

Beyond speed — A primer on RAM speed


RAM is confusing. It has specific nomenclature that isn’t applicable to other components in your system, plus loads of specifics to parse. Even the term “RAM” isn’t quite correct, as it’s colloquially used to describe DRAM despite the fact that it’s an umbrella term that can also apply to other types of memory.

Gamers mostly care about the main spec marketed on RAM kits: frequency. Frequency is actually the data rate of the memory (expressed as megatransfers per second), but you can still think of it as frequency like you would with a CPU or GPU. This is one of the few specs that RAM shares with your other components, and you’d be forgiven for thinking that a higher frequency means faster speed. That’s not always the case, though.

Going a level deeper, you have the CAS latency of the RAM (you’ll see it as “CL38” or something similar on DDR5 boxes). This is the time it takes for the RAM to hand off data when called upon by the CPU, and lower latencies are preferred. Using the CAS latency and data rate, you can calculate the real-world latency by dividing the CAS latency by the data rate, then multiplying by 2,000. DDR4-3200 memory with a CAS latency of 14, for example, has a latency of 8.75 nanoseconds.
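
To make that arithmetic concrete, here is a tiny Python helper for the formula; the DDR4-3200 CL14 example above works out to 8.75 nanoseconds.

```python
def real_world_latency_ns(cas_latency: int, data_rate: int) -> float:
    """Real-world latency in nanoseconds: CAS latency divided by data rate, times 2,000."""
    return cas_latency / data_rate * 2000

print(real_world_latency_ns(14, 3200))  # DDR4-3200 CL14 -> 8.75 ns
```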

Memory speed matters for gaming. By how much is a different question.

It’s important to take both the data rate and CAS latency into account when choosing your RAM kit, as a faster, more expensive kit on paper can actually result in identical performance to a cheaper kit. I’ll dig into that more in the next section, so hold tight.

In the early days of a new memory standard, you’re usually looking at high latencies so companies can market high speeds (regardless of how that balance actually shakes out for performance). That’s no longer the case with DDR5, though. It has already matured enough for the bandwidth improvements to overtake the higher latencies compared to DDR4. That means the speed of your DDR5 actually matters for gaming. By how much, though, is a different question.

DDR5 and gaming in the real world

A test bench with DDR5 installed.
Jacob Roach / Digital Trends

With the technobabble out of the way, let’s get to the fun stuff: benchmarking. I tried out three DDR5 kits at different data rates and latencies to see if there was a difference in gaming performance, and I found a few differences in some titles.

I ran all of my tests with an Intel Core i9-12900K, 12GB Nvidia RTX 3080, and an MSI Z690 Carbon Wi-Fi motherboard. The relationship between the CPU and RAM is critical, so keep in mind that DDR5 performance with AMD’s Ryzen 7000 processors might be different from what I found here. It should be largely similar to previous AMD generations, though, with a similar architecture at the helm.

I tested four speeds: DDR5-4800 CL38, DDR5-5200 CL38, DDR5-6000 CL40, and DDR5-6200 CL42.

DDR5 speeds in various games.

Most, if not all, DDR5 kits will run at 4800MHz out of the box, so you’ll need to turn on either Intel Extreme Memory Profile (XMP) or AMD’s new EXPO standard for the advertised speed. Toggling on the DDR5-5200 kit didn’t provide an immediate uplift. I actually saw slightly slower results in Gears Tactics, but F1 2022 saw a solid 3% improvement.

That trend continued into DDR5-6000, with Gears Tactics jumping by 3% and F1 2022 by an additional 8%. That quickly changed with the PNY XLR8 Gaming DDR5-6200 kit, though. I actually saw lower performance in Gears Tactics and an identical result in F1 2022. Then there was Red Dead Redemption 2, which I’ll circle back to in a moment.

I should note that this PNY kit was the only one with three XMP profiles on board, including a DDR5-5600 option that matched my DDR5-6000 results.

PNY XLR8 Gaming memory sitting on a mouse pad.
Jacob Roach / Digital Trends

If you put on your math cap for a moment, these results make sense. The DDR5-6000 kit actually has the lowest real-world latency out of the three, so it’s giving the best result out of these kits. Lower latency isn’t inherently better across memory generations, so always take it in context of bandwidth. Solid DDR4-3200 CL14 kits still have lower latency overall, but the decreased bandwidth means they aren’t as performant as DDR5 kits with higher latency.
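
Running the same formula over the speeds tested above shows the pattern: the DDR5-6000 CL40 configuration ends up with the lowest real-world latency of the group.

```python
# Real-world latency (CAS latency / data rate * 2,000) for the tested DDR5 speeds
kits = {
    "DDR5-4800 CL38": (38, 4800),
    "DDR5-5200 CL38": (38, 5200),
    "DDR5-6000 CL40": (40, 6000),
    "DDR5-6200 CL42": (42, 6200),
}

for name, (cas, rate) in kits.items():
    print(f"{name}: {cas / rate * 2000:.2f} ns")

# DDR5-4800 CL38: 15.83 ns
# DDR5-5200 CL38: 14.62 ns
# DDR5-6000 CL40: 13.33 ns
# DDR5-6200 CL42: 13.55 ns
```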

For the upcoming generation, the balance of CAS latency and speed is important so you don’t buy a kit you don’t need. In the previous generation, we saw this dynamic between DDR4-3200 CL16 and DDR4-3600 CL16. The 3600MHz kit is faster, sure, but the latency is identical. That means spending up for a DDR4-3600 kit would effectively be wasted money.

That’s what we’re seeing here with DDR5-6000 and DDR5-6200. With nearly identical latency, you’ll see the same or slightly lower performance with the faster kit, especially depending on the memory dies inside.

Just as with raw data rate, you shouldn’t put all the weight on latency. Red Dead Redemption 2 is a prime example that the games you play are the largest influence on performance. I saw no difference across the four DDR5 speeds tested, showing that this game isn’t particularly sensitive to memory speed (it’s GPU bound in many cases, as you’ll typically find with modern games).

Which DDR5 kit should you buy?

Corsair Dominator Platinum memory installed in a PC.
Jacob Roach / Digital Trends

Although it’s interesting to look at DDR5 speeds and how they impact gaming performance, that raw data doesn’t tell you which kit to buy. After all, you can’t just buy a bunch of RAM kits to find the best speed for the games you play.

The only place to start is price. There isn’t any real price difference between the base DDR5-4800 kits you’ll find and DDR5-5200. A 2x16GB kit of Corsair Vengeance DDR5-4800 is $150 and the DDR5-5200 version is only $2 more. Big deal. There are some DDR5-5600 kits between $160 and $200, but as prices climb, most of the extra money goes toward RGB lighting instead of speed.

If you want to jump to DDR5-6000, you’ll spend $220 at minimum for the same capacity. Beyond that, anything is fair game, with some kits jumping up to $300 or above. Clearly, an extra $70 isn’t going to buy you extra performance in every game, but as DDR5 prices continue to drop, we’ll likely see pricing tighten up among different speeds.

The DDR5 memory you buy all comes down to the games you play.

So, which DDR5 kit should you buy? At current prices, a solid DDR5-5200 kit is all most people need, offering a solid bump in CPU-limited games without costing much more than DDR5-4800. AMD says the sweet spot for upcoming Ryzen 7000 CPUs is DDR5-6000, though, so I’ll revisit this topic once we have new CPUs in hand. For now, you don’t need to climb that high.

As is always the case with PC hardware, it all comes down to the games you play. As I wrote about in my last entry on upgrading your gaming CPU, you can learn a lot about what components you should upgrade by analyzing the games you play. A big open-world game like Red Dead Redemption 2 may not care about faster memory, but a CPU-limited title like F1 2022 favors it a lot. At the end of the day, the games you want to play should be your touchstone for making informed PC upgrades.

This article is part of ReSpec – an ongoing biweekly column that includes discussions, advice, and in-depth reporting on the tech behind PC gaming.


These secret Finder settings will vastly improve your Mac

If you own a MacBook Air, MacBook Pro, or any other Apple device that runs MacOS, then you probably know that the Finder is the core experience. It is the home to navigating your files, apps, and other things that you use on a day-to-day basis.

But did you know that you don’t have to stick with how Apple has set up Finder, right out of the box? Like many other things in MacOS, the Finder experience is actually highly customizable. We wanted to highlight some of the top MacOS tips all centered around Finder, and how to make it more useful.

Change what new Finder windows show


By default, new Finder windows like to show you your most recent files. This isn’t always convenient; if you download a lot of files or have a lot of documents, your Finder window will be a huge mess. That’s why you can swap the folder new Finder windows show by default to something else, like Documents or Desktop.

To do this, open a new Finder window. Then, visit the Finder option in the menu bar at the top of your screen. Choose Preferences and then, under New Finder windows show, pick the folder you want. This way, you get quick access to the folder you want instead of a list of recent files.

Enable hard disks on the desktop

Showing hard disks on the Mac desktop
Arif Bacchus/ Digital Trends

Next up is a tip that saves you from having to open Finder manually in a specific case. If you enable hard disks on the desktop, you can see all the disks you have attached to your Mac, including network drives and external hard drives. This is easier than having to open Finder first and then click in the sidebar to get to your disks.

Again, you just need to open a new Finder window. Then, visit Finder in the menu bar at the top of your screen. Choose Preferences. Make sure that the Hard disks option is selected under Show these items. You might want to enable the other items in this list, too. These include external disks; CDs, DVDs, and iPods; and connected servers.

Tweak the Sidebar to enable what you want to see

Sidebar options in the Finder
Arif Bacchus/ Digital Trends

The sidebar in the Finder gives you access to common areas of MacOS like applications, downloads, documents, and more. If you don’t use some of those items and want to add other locations, though, then you can tweak those.

Just visit the Finder preferences, and then click on the Sidebar menu. There are a ton of items that you can add, like AirDrop, Movies, Pictures, iCloud Drive, and so much more! Click to check the ones you want to enable, and uncheck the ones you don’t need!

Show your filename extensions

Showing Filenames in the Finder in MacOS
Arif Bacchus/ Digital Trends

There’s a good chance you’re using a lot of different apps on your Mac, and each of those apps works with different file extensions. Examples include .doc for Word documents, .pdf for PDF files, and .psd for Photoshop files. MacOS does a good job of showing you previews of those file types, but you might also want to enable filename extensions so you have a better idea of what you’re dealing with.

This setting is found under the Advanced section of Finder Preferences. From there, you can enable filename extensions by clicking Show all filename extensions. Once you do, Finder will change the way files are shown, displaying extensions like the examples mentioned above whenever a file is listed.
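
If you’re comfortable with the Terminal, the same preference can be flipped from the command line. Here’s a minimal sketch using Python to call the macOS defaults command; it assumes the standard AppleShowAllExtensions key and relaunches Finder so the change takes effect right away.

```python
import subprocess

# Write the global preference that tells Finder to always show filename extensions.
subprocess.run(
    ["defaults", "write", "NSGlobalDomain", "AppleShowAllExtensions", "-bool", "true"],
    check=True,
)

# Relaunch Finder so the new setting is picked up immediately.
subprocess.run(["killall", "Finder"], check=True)
```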

Customize your Finder toolbar and add Airdrop

Customizing the Toolbar in Finder
Arif Bacchus/ Digital Trends

See the icons at the top right of new Finder windows? You can actually customize those and add different options, including a shortcut to AirDrop. This makes your life a lot easier when sharing files, but you can also add other quick actions like creating new folders, burning or ejecting CDs, and more.

To tweak this Finder option, all you need to do is visit the menu bar after opening a new Finder window. This time, though, go to View and then Customize Toolbar. You’ll then see different icons you can enable. Click on the icons, and drag them into your new Finder window! Simple as that! If you want, you even can change the option under Show so that your view will include icons, as well as text, for a better understanding of what the icon does.

We do have a note on the AirDrop option, though. If you opt to drag it into your toolbar, it will only work when files are selected. Otherwise, it will appear grayed out. Once files are selected, you can click the AirDrop icon and send them over to an iPhone, iPad, or other Apple device.

Add the Path Bar and Status Bar to navigate quickly

The sidebar and the status bar in MacOS
Arif Bacchus/ Digital Trends

Do you often lose your place when navigating files and folders in Finder? Well, while they’re turned off by default, you can make your life easier by showing the Path Bar and Status Bar. To do this, just click on the menu at the top of your screen when Finder is open, and choose View followed by Show Path Bar and Show Status Bar. This brings up a directory trail of sorts at the bottom of your Finder window. You’ll also be able to see available storage and even change the icon size for files. Pretty cool, huh?

Change view options and add wallpapers to folder windows

Pictures in the MacOS Sidebar
Arif Bacchus/ Digital Trends

Our last Finder setting to tweak is actually one of the coolest. You can change your view options from within Finder to add different wallpapers to different folders in the Finder background. This can help you set folders apart more easily.

Like our other tips, this one’s easy. Just open a new Finder window, go to the menu bar, and choose View followed by Show View Options. From there, look under Background, make sure Picture is selected, and drag in an image to use as a background. You can also choose a specific color instead. It’s simple and easy, and the possibilities for customization are endless.


Razer Blade firmware update could improve GPU performance

Razer is introducing two new configurations to its Blade 17 series gaming laptops for 2022, as well as bumping performance on some existing units through a firmware update.

The new Razer Blade 17 models will now include options for up to Intel Core i9-12900H processors and Nvidia GeForce RTX 3070 Ti GPUs. This hardware setup has never before been seen on the Blade 17 laptops, as Razer pointed out.

The entry model also gets a display upgrade: its 17.3-inch QHD display now comes with a 240Hz refresh rate instead of 165Hz. That configuration has 16GB of DDR5 RAM and starts at $3,400. The 17.3-inch UHD model features a 144Hz refresh rate and 32GB of DDR5 RAM, along with a starting price of $3,800.

Several models of the 2022 laptop that were released earlier this year are also available, with the cheapest version starting at $2,700. Like other models, the latest Blade 17 sells with a two-year battery warranty.

Razer also claims the 2022 Blade 17 models running RTX 3060 and RTX 3080 Ti GPUs will receive a boost in total graphics power (TGP) via an upcoming firmware update. The update is expected to add an additional 10 watts of graphics power to each system. The Blade 17 with the RTX 3060 GPU can expect a maximum TGP of 130 watts (115 watts + 15 watts), and the Blade 17 with the RTX 3080 Ti a maximum of 175 watts (150 watts + 25 watts).

Models newer than these are set to release with increased TGP from production.

In addition to the Blade 17, Razer has also been showcasing its Blade 15 and Blade 14 laptops throughout the year. The Blade 15 is somewhat similar to the Blade 17, with many display options and a focus on power. The Blade 14 has a thin-and-light ultrabook design and offers AMD processor options in addition to Razer’s Intel-based machines.

Razer is also offering a limited-time deal, giving those who buy a Razer Blade 17 or Razer Blade 15 with a 12th-gen Intel processor a free 6-month subscription to the Vegas Post 365 video production bundle or 40% off a one-year subscription. The deal is available until July 31.


Niantic buys gameplay recording app Lowkey to improve its in-game social experience

Niantic has acquired another company to help build out its augmented reality platforms. The company has announced that it’s acquiring the team behind Lowkey, an app you can use to easily capture and share gameplay moments. While you can use any screen capture application — or even your phone’s built-in feature — to record your games, Lowkey was designed with casual gamers or those who don’t want to spend time editing their videos in mind. 

The app can capture videos on your computer, for instance, and sync them with your phone, where you can use its simple editing tools to create short clips optimized for mobile viewing. You’re also able to share those clips with friends within the app, Snapchat-style, or publish them for public viewing, TikTok-style. Niantic didn’t reveal what the Lowkey team will be doing for its AR games and experiences exactly, but it said the team’s “leadership in this space will accelerate the social experiences [it’s] building in [its] products.” The company added: “We share a common vision for building community around shared experiences, and enabling new ways to connect and play for our explorers.”

The Pokémon Go creator purchased other companies in the past in its quest to build more tools and features for its augmented reality products. In 2017, it purchased social animation startup Evertoon to build a social network for its games. Last year, it bought 3D mapping startup 6D.ai to develop “planet-scale” augmented reality, and just this August, it acquired LiDAR scanning app Scaniverse to create a 3D map of the world.


How Adobe uses deep learning to improve its products

As it does every year, Adobe’s Max 2021 event featured product reveals and other innovations from the world’s leading computer graphics software company.

Among the most interesting aspects of the event was Adobe’s continued integration of artificial intelligence into its products, an avenue the company has been exploring for the past few years.

Like many other companies, Adobe is leveraging deep learning to improve its applications and solidify its position in the video and image editing market. In turn, the use of AI is shaping Adobe’s product strategy.

AI-powered image and video editing

Sensei, Adobe’s AI platform, is now integrated into all the products of its Creative Cloud suite. Among the features revealed in this year’s conference is an auto-masking tool in Photoshop, which enables you to select an object simply by hovering your mouse over it. A similar feature automatically creates mask layers for all the objects it detects in a scene.

The auto-mask feature saves a lot of time, especially in images where objects have complex contours and colors and would be very difficult to select with classic tools.

Adobe has also improved Neural Filters, a feature it added to Photoshop last year. Neural Filters use machine learning to add enhancements to images. Many of the filters are applicable to portraits and images of people. For example, you can apply skin smoothing, transfer makeup from a source image to a target image, or change the expression of a subject in a photo.

Other Neural Filters make more general changes, such as colorizing black-and-white images or changing the background landscape.

The Max conference also unveiled some preview and upcoming technologies. For example, a new feature for Adobe’s photo collection product called “in-between” takes two or more photos captured within a short interval of each other and creates a video by automatically generating the frames that fall in between them.

Another feature being developed is “on point,” which helps you search Adobe’s huge library of stock images by providing a reference pose. For example, if you provide it with a photo of a person sitting and reaching out their hand, the machine learning models will detect the pose of the person and find other photos where people are in similar positions.

AI features have been added to Lightroom, Premiere, and other Adobe products as well.

The challenges of delivering AI products

When you look at Adobe’s AI features individually, none of them are groundbreaking. While Adobe did not provide any architectural or implementation details in the event, anyone who has been following AI research can immediately relate each of the features presented at Max to one or more papers and presentations made at machine learning and computer vision conferences in the past few years. Auto-masking uses object detection and segmentation with deep learning, an area of research that has seen tremendous progress recently.
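
For instance, the kind of object segmentation that underpins auto-masking is available off the shelf in open-source libraries. The sketch below is purely illustrative and is not Adobe’s implementation; it assumes torchvision is installed and a local photo.jpg exists, and uses a pretrained Mask R-CNN to produce a mask per detected object.

```python
import torch
from torchvision.io import read_image
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

# Mask R-CNN pretrained on COCO returns boxes, labels, confidence scores,
# and a soft mask for every object it detects.
model = maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = convert_image_dtype(read_image("photo.jpg"), dtype=torch.float)

with torch.no_grad():
    predictions = model([image])[0]

# Keep confident detections and binarize their masks: one layer per object,
# similar in spirit to creating a mask layer for each detected object.
keep = predictions["scores"] > 0.7
masks = predictions["masks"][keep, 0] > 0.5  # shape: (num_objects, H, W)
print(f"Detected {masks.shape[0]} objects")
```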

Style transfer with neural networks is a technique that is at least four years old. And generative adversarial networks (GAN), which power several of the image generation features, have been around for more than seven years. In fact, a lot of the technologies Adobe is using are open source and freely available.

The real genius behind Adobe’s AI is not the superior technology, but the company’s strategy for delivering the products to its customers.

A successful product needs to have a differentiating value that convinces users to start using it or switch from their old solutions to the new application.

The benefits of applying deep learning to different image processing applications are very clear. They result in improved productivity and lower costs. The assistance provided by deep learning models can help lower the barrier of artistic creativity for people who don’t have the skills and experience of expert graphical designers. In the case of auto-masking and neural filters, the tools make it possible even for experienced users to solve their problems faster and better. Some of the new features, such as the “in-between” feature, are addressing problems that had not been solved by other applications.

But beyond superior features, a successful product needs to be delivered to its target audience in a way that is frictionless and cost-effective. For example, say you develop a state-of-the-art deep learning–powered neural filter application and want to sell it on the market. Your target users are graphic designers who are already using a photo-editing tool such as Photoshop. If they want to apply your neural filter, they’ll have to constantly port their images between Photoshop and your application, which causes too much friction and degrades the user experience.

You’ll also have to deal with the costs of deep learning. Many user devices don’t have the memory and processing capacity to run neural networks and require cloud-based processing. Therefore, you’ll have to set up servers and web APIs to serve the deep learning models, and you also have to make sure your service will remain online and available as the usage scales. You only recoup such costs when you reach a large number of paying users.

You’ll also have to figure out how to monetize your product in a way that covers your costs while also keeping users interested in using it. Will your product be an ads-based free product, a freemium model, a one-time payment, or a subscription service? Most clients prefer to avoid working with several software vendors that have different payment models.

And you’ll need an outreach strategy to make your product visible to its intended market. Will you run ads on social media, make direct sales and reach out to design companies, or use content marketing? Many products fail not because they don’t solve a core problem but because they can’t reach out to the right market and deliver their product in a cost-efficient manner.

And finally, you’ll need a roadmap to continuously iterate and improve your product. For example, if you’re using machine learning to enhance images, you’ll need a workflow to constantly gather new data, find out where your models are failing, and finetune them to improve their performance.

Adobe’s AI strategy

Adobe already has a very large share of the graphics software market. Millions of people use Adobe’s applications every day, so the company has no problem in reaching out to its intended market. Whenever it has a new deep learning tool, it can immediately use the vast reach of Photoshop, Premiere, and the other applications in its Creative Cloud suite to make it visible and available to users. Users don’t need to pay for or install any new applications; they just need to download the new plugins into their applications.

The company’s gradual transition to the cloud in the past few years has also paved the way for a seamless integration of deep learning into its applications. Most of Adobe’s AI features run in the cloud. To its users, the experience of the cloud-based features is no different than using filters and tools that are directly running on their own devices. Meanwhile, the scale of Adobe’s cloud makes it possible for the company to run deep learning inference in a very cost-effective way, which is why most new AI features are made available for free to users who already have a Creative Cloud subscription.

Finally, the cloud-based deep learning model provides Adobe with the opportunity to run a very efficient AI factory. As Adobe’s cloud serves deep learning models to its users, it will also gather data to improve the performance of its AI features in the future. For example, the company acknowledged at the Max conference that the auto-masking feature does not work for all objects yet but will improve over time. The continued iteration will in turn enable Adobe to enhance its AI capabilities and strengthen its position in the market. The AI in turn will shape the products Adobe will roll out in the future.

Running applied machine learning projects is very difficult, and that is largely why companies fail to bring them to fruition. Adobe is an interesting case study of how bringing together the right elements can turn advances in AI into profitable business applications.

Ben Dickson is a software engineer and the founder of TechTalks. He writes about technology, business, and politics.

This story originally appeared on Bdtechtalks.com. Copyright 2021


Notable aims to improve AI in health care with new $100M

This article is part of a VB special issue. Read the full series: AI and the future of health care


Notable, an intelligent automation company focused on health care, today announced it received a $100 million series B funding round. The investment, led by ICONIQ Growth with participation from Greylock Ventures, Oak HC/FT, and F-Prime, will be used to expand access to more health care providers and enhance the platform’s capabilities so partners achieve a higher return on investment.

The reality is that many health care providers still use repetitive, manual workflows, which cost over $1 trillion in administrative overhead per year. A patient may spend seven minutes with a physician – but that visit could result in hundreds of minutes of administrative work per clinician, according to Pranay Kapadia, cofounder and CEO of Notable. Using AI, Notable can eliminate more than 700 minutes of that administrative work, including creating clinical documentation and adding billing codes for insurance claim processing.

The investment points to a larger industry trend toward using AI to improve patient care and streamline processes. Care sites like Intermountain Healthcare and CommonSpirit Health already use Notable, which automates everything from patient scheduling and check-in to post-visit follow-up, as well as creating clinical documentation and adding billing codes.

Demand for AI continues to increase as patients expect a digital-first experience due to the COVID-19 pandemic, as well as the “great resignation” that has left every industry — including health care — short-staffed. “Technology needs to drive ten times the efficiency at a quarter of the cost,” said Kapadia.

“Technology is the future of everything, and health care is no exception,” said Andrew J. Scott, founding partner of 7percent Ventures. “Artificial intelligence is already having a positive impact. Companies like Kheiron Medical can already perform mammography analysis for breast cancer better than a human.”

7percent Ventures invests in AI technology including Limbic, which uses AI for mental health triage and support, and Kheiron Medical, which provides improved breast cancer diagnosis. These “are the sorts of transformative technologies that have a positive impact and improve the way we live,” he said.

Will AI Provide All Diagnoses?

Going all-in on AI in a health care setting may speed up a diagnosis – but it also takes away a physician’s autonomy in making the diagnosis and recommending treatment, according to Robert Wachter, MD, professor and chair of the Department of Medicine at the University of California, San Francisco.

“There are a lot of sources of pushback, from the physician’s ego to worries about malpractice and who is liable, to ethical issues around AI,” such as whether the data is biased, he said. For example, the data may note that patients of one race don’t need as much medication as patients of another, without taking into account that particular patient’s situation.

AI will tackle more tractable problems like workflows before heading into the more difficult ones like diagnosis and prognosis, but there won’t be a real “AI moment,” Wachter said. “You start …where the stakes are less high, with business and operational problems.”

Instead, AI will augment what physicians are doing and provide options, including triage, but ultimately leave the decision up to the physician’s discretion.

“I see AI working silently behind the scenes of the busy clinician,” said Chris Larkin, chief technology officer at Concord Technologies. “The models will continue to gather data on patient diagnosis and trajectories and update the clinician when it’s appropriate. This is more like modern avionics, working on behalf of the pilot of the aircraft.”

For example, ICU nurses hear thousands of patient alarms on their shifts, many of which are false. AI can help the nurses decide which ones are most pressing based on the patient’s diagnosis and attend to them first, Larkin said.

Some clinicians already are using AI and machine learning exactly this way. “I’ve used VIDA Insights as an AI agent to assist me in interpreting chest CTs,” said John Newell, MD, professor of radiology and biomedical engineering, director of the Radiology Image Phenotyping Laboratory, and the co-director of the Iowa Institute for Biomedical Imaging.

Additionally, AI can help lower costs for both patients and health care organizations while providing better care. “If AI can help us to diagnose disease earlier and with more accuracy, the impact on reducing the cost of patient care can be significant,” Newell said.

“For example, a patient with early-stage COPD spends about $1,600 [per] year on care versus a patient with advanced-stage COPD who spends nearly $11,000 [per] year. COPD is often diagnosed later in the disease process, so any tools that can help providers identify it early can have a massive impact on population health care costs.”

Despite the opportunities AI provides for the health care industry, humans will always be needed — and AI doesn’t aim to entirely displace them. “With all the AI in the world, [there’s still] a certain level of empathy that comes in health care,” Kapadia said, noting that AI isn’t needed for things like comforting a child with a sore throat.


Microsoft buys Two Hat to improve Xbox community moderation

On Friday, Microsoft announced it has acquired Two Hat, a company best known for its AI content moderation tools. Financial details have not been disclosed, but Microsoft did share its vision for how the two will work together moving forward. Over the years, the two companies have frequently collaborated to make Xbox Live and other gaming communities safer, and by the sounds of it, that will remain Two Hat’s focus moving forward.

“We have partnered with Xbox and the Microsoft team for several years and share the passion and drive to make meaningful change in the advancement of online civility and citizenship,” said Two Hat founder Chris Priebe and CEO Steve Parkis in a statement. “We are committed to ensuring safety, inclusion and online health and wellness are always at the forefront of our work and through joining Microsoft, we can provide the greatest concentration of talent, resources and insight necessary to further this vision.”

Before today’s announcement, Microsoft was only one of Two Hat’s customers, and that won’t change following the acquisition. “This is a deep investment in assisting and serving Two Hat’s existing customers, prospective new customers and multiple product and service experiences here at Microsoft,” the company said. “With this acquisition, we will help global online communities to be safer and inclusive for everyone to participate, positively contribute and thrive.”

Since 2019, Microsoft has placed an emphasis on making its gaming communities safer and more inclusive. “Gaming is for everyone,” Xbox chief Phil Spencer said at the time. This acquisition should tie in nicely with that goal.
