NVDA
#1
Artificial intelligence is pushing into every area of our lives, with deep inroads into our mobile devices, our video games, and even our own brains. Video surveillance could be the next step if NVIDIA's new video analytics platform, Metropolis, is successful. The initiative, announced just ahead of the company's annual GPU conference this week, will use deep learning AI to analyze the massive amount of data from surveillance video for "public safety, traffic management and resource optimization."
According to NVIDIA, there are already hundreds of millions of surveillance cameras around the globe, with the number expected to reach the 1 billion mark by 2020. Human beings have a hard time sifting through the flood of moving images, so the majority of it is stored on hard drives for later viewing. NVIDIA thinks that deep learning AI can perform video analytics much more accurately than humans or even real-time computer monitoring. The company has partnered with more than 50 companies that make security cameras, including Hikvision. "The benefit of GPU deep learning is that data can be analyzed quickly and accurately to drive deeper insights," said Shiliang Pu, president of the Hikvision Research Institute in China.

NVIDIA's AI may keep watch over smart cities of the future

There were no updates on the financial outlook from Nvidia management, but the company did talk up what it believes are $70 billion worth of addressable markets for its products come 2020, and analysts seem encouraged by that today. As one analyst, Rosenblatt’s Hans Mosesmann, crowed today, "The level of disruption that the Nvidia GPU acceleration approach will do to the computing world will be epic in our opinion."

Among the intriguing topics that emerged amidst everything else was that of “inferencing.” As I discussed in a November interview with Jen-Hsun Huang, Nvidia’s CEO, inferencing is the corollary of what happens in today’s machine learning. In machine learning’s “training” phase, which is where Nvidia is getting most of its GPU contracts, computing facilities learn how to understand information by sifting through large data sets.

Inferencing is when the machine responds to an actual query from an internet user by applying the rules it has learned in the training phase. Huang indicated to me in November that having his chips used for inferencing would spread their adoption to a much broader set of uses, where just about every Web page request could become a kind of inference.

Nvidia Up Another 5%: Bulls Delighted Pondering ‘Epic Disruption’ - Barron's
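The training/inference split the article describes can be made concrete with a toy example. This is a hedged, hypothetical sketch (a one-parameter threshold classifier, nothing Nvidia actually ships): training is a one-time pass over a labeled data set that produces parameters, while inference applies those fixed parameters to each new query, e.g. once per web request.

```python
# Hypothetical sketch of the training vs. inference phases described above.

def train(examples):
    """Training: sift a labeled data set once to learn a decision threshold."""
    positives = [v for v, label in examples if label == 1]
    negatives = [v for v, label in examples if label == 0]
    # Place the threshold halfway between the two class means.
    return (sum(positives) / len(positives) + sum(negatives) / len(negatives)) / 2

def infer(threshold, query):
    """Inference: answer a single query using the already-learned threshold."""
    return 1 if query >= threshold else 0

# Training happens once, offline, over the whole data set...
threshold = train([(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)])
# ...while inference runs cheaply per request.
print(infer(threshold, 0.75))  # -> 1
```

The asymmetry is the point of Huang's remark: training is a heavy, occasional batch job, but inference is a small computation repeated billions of times, which is why winning inference workloads would multiply the addressable market.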

Reply

#2
That’s why Nvidia, the processor company whose graphics processing units (GPUs) are powering much of the boom in deep learning, is now focused on the edge. Deepu Talla, vice president and general manager of the company’s Tegra business unit, says bringing AI technology to the edge will make a new class of intelligent machines possible. “These devices will enable intelligent video analytics that keep our cities smarter and safer, new kinds of robots that optimize manufacturing, and new collaboration that makes long-distance work more efficient,” he said in a statement.

Nvidia wants AI to Get Out of the Cloud and Into a Camera, Drone, or Other Gadget Near You - IEEE Spectrum

Reply

#3

E-sports are gaining popularity rapidly, a boon to Nvidia's bottom line. On the post-earnings conference call, its CEO nodded to the gaming sector's positive impact on business.

"With apologies to the start of the baseball season, e-sports is now as popular among U.S. male millennials as America's favorite pastime," Huang said. "More people will be watching gaming than HBO, Netflix, ESPN and Hulu combined."

And with Nvidia client Electronic Arts beefing up its new Star Wars Battlefront II game for a November release, Cramer has no doubt that this trend will accelerate.

The next group of drivers is Nvidia's clients in the professional visualization space, meaning the virtual images companies use to let customers preview their products.

"Nvidia's chips are being used by everybody, from Lockheed Martin for reliable virtual reality for the U.S. Navy to Ikea," Cramer said.

With the new administration promising to bolster the U.S. military and e-commerce sales on the rise, the forecast for that area of Nvidia's business looks sunny as well.

Nvidia's third booster comes from the data center space, where its business is now three times larger than it was one year ago, Cramer said.

"Amazon Web Services, Facebook, [Alphabet's] Google, IBM and Microsoft all rely on its chips — who else is there — to power their cloud platforms because of Nvidia's artificial intelligence prowess," he added.

Fourth but not least is Nvidia's burgeoning auto business, which grew 24 percent last quarter. The company's products are now in 225 car and truck models, and Nvidia's recent partnerships with Bosch, the world's biggest auto supplier, and Paccar, a leading truck manufacturer, show the company making definitive forays into the autonomous vehicle space.

"Huang made it clear that because of the Amazon effect, there is a shortage of professional drivers coming. You'll need autonomous cars and trucks to power everything from shuttles and vans to pizza delivery in the not too distant future," Cramer explained. "You could understand from this call why Intel had to buy Mobileye. More chips for autonomous cars. Without it, Nvidia could leave the world's largest chipmaker behind."

Cramer walked away from the conference call with confidence that Nvidia is a leader in these monumental areas that position it to seriously benefit from growth in products like Amazon's Echo and Alexa and industries like gaming, which now serves 30 million people.

Cramer lists 4 things propelling Nvidia's booming business

Reply

#4
Hari, reiterating a Buy rating, writes that "Given increased confidence in the growth trajectory and its sustainability," the stock now deserves a multiple of 31 times projected earnings for 2019 (fiscal year ending January), versus the 25 times Hari had previously used. Hari notes this is "not necessarily" a valuation increase, because the stock already trades for 37.9 times calendar 2018 estimates.

Among the details of the presentation, Hari was particularly pleased that Nvidia discussed how its chips for machine learning in data centers could be used not just for "training" but for "inferencing": "the 2020 TAM forecasts provided by the company (i.e., ~$30bn for HPC/training/inferencing or nearly 8x our FY20 Datacenter revenue forecast of $3.6bn)," writes Hari. "We note that the consensus view among investors ahead of the Investor Day was 'while NVDA has a strong foothold in the training market, they are behind the curve relative to CPUs/FPGAs/ASICs in inference.'"

Nvidia: Take Advantage of Short-Sighted Investor Worries, Says Goldman - Barron's

Reply

#5
In the last five years, three technology breakthroughs NVDA refers to as the "Big Bang of AI" have transformed certain AI technologies - in particular vision - from impractical to commercially viable. Investors are broadly aware of two of these breakthroughs - massively parallel processing (provided by NVDA and others) and massive data storage (provided by webscale companies). We highlight an under-appreciated third breakthrough of the Big Bang - improved algorithms - that emerged in 2012 and 2015. Stein goes through a whole history of algorithms: "beginning in 2006 researchers developed three major breakthroughs in regards to the algorithms used in deep learning: Restricted Boltzmann Machines (RBMs), Deep Belief Networks (DBNs), and Convolutional Neural Networks (CNNs)." He goes into each in detail, and comes back at the end of the report to point out that Nvidia is the "key semiconductor and development tool player" in A.I.

Nvidia: ‘We’re Not Early,’ but, ‘It’s Not Too Late,’ Says SunTrust - Barron's

Volta is the new GPU architecture Nvidia revealed earlier this year. The new chips were promised to be such an improvement over current models that shares of the company jumped 17.8% in a single day after their announcement. AI research requires training a computer program to be as efficient as possible before it works well. This training requires multiplying matrices of data, which normally would have to be done one number at a time. The new Volta GPU architecture is able to multiply entire rows and columns of matrix data at once, rapidly speeding up the AI training process. Nvidia claims the new Volta architecture is 12 times faster at matrix multiplication than its previous "Pascal" architecture. It reduces the duration of an AI training task that used to take 18 hours to 7.4 hours, according to company data. Nvidia gave away 15 of its Volta-based Tesla V100 chips to top researchers attending the conference. The chips were some of the first available outside the company, and were signed by CEO Jensen Huang.

Nvidia gave away its newest AI chips for free - and that's part of the reason why it's dominating the competition (NVDA) | 07/25/17 | Markets Insider
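The "one number at a time" bottleneck the article mentions is easy to see in a naive scalar matrix multiply. This sketch is plain Python, not Volta code; it just shows that every output element is a dot product built from individual multiply-adds, which is exactly the work that tensor hardware like Volta's batches into whole-tile operations instead of executing one scalar at a time.

```python
# Naive scalar matrix multiplication: the workload that dominates
# deep learning training, computed one multiply-add at a time.

def matmul(a, b):
    """Multiply matrices a (m x k) and b (k x n) element by element."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):           # one pass per output row
        for j in range(cols):       # one pass per output column
            for k in range(inner):  # the scalar multiply-adds tensor cores fuse
                out[i][j] += a[i][k] * b[k][j]
    return out

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# -> [[19.0, 22.0], [43.0, 50.0]]
```

The triple loop makes the cost visible: multiplying two n x n matrices takes on the order of n^3 scalar multiply-adds, so hardware that processes whole rows and columns per step (as the article says Volta does) directly attacks the dominant term in training time.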

Reply

#6
Tesla and AMD partner on A.I. chip

Tesla is getting closer to having its own chip for handling autonomous driving tasks in its cars. The carmaker has received back samples of the first implementation of its processor and is now running tests on it, said a source familiar with the matter. The effort to build its own chip is in line with Tesla's push to be vertically integrated and decrease reliance on other companies. But Tesla isn't completely going it alone in chip development, according to the source, and will build on top of AMD intellectual property. AMD shares spiked after CNBC reported that the company is working with Tesla. The shares ended the day nearly 5 percent higher and continued to climb after hours.

Tesla building an AI chip for its cars with AMD, GlobalFoundries

Why does this matter for Nvidia? Roughly 10% of NVDA's revenue comes from graphics cards sold to cryptocurrency miners. Fewer miners means less demand, which in turn results in lower sales for Nvidia; on the flip side, more demand means higher sales. A recent analyst note from Evercore ISI says investors should instead be more focused on Nvidia's A.I. developments, Cramer pointed out. The market doesn't fully appreciate what Nvidia is capable of in terms of earnings, and as a result the analyst raised the price target to $250 from $180.

It's Time to Unleash Nvidia (NVDA), Jim Cramer Says - TheStreet

A ban on cryptocurrencies by China's government on Thursday will likely drive up the "mining" of bitcoin and the like, writes RBC Capital analyst Mitch Steves in a report to clients regarding Nvidia (NVDA) and Advanced Micro Devices (AMD), both of which make tools for bitcoin discovery. Banning the currencies only drives more use of the GPU chips sold by both companies to mine more currency, opines Steves. The ban will likely increase demand for cryptocurrency-related GPUs. With the China ban, the only way to obtain cryptocurrencies mined with GPUs is to mine them with computing power (or purchase them in person from a stranger). China is now banning VPNs and straight purchases (fiat -> crypto), which means the best way to obtain the currency without purchasing it is to mine it using GPUs (ASICs for Bitcoin). Net net: as noted in several reports, while the price of the coin can help drive demand, the network rate is the leading indicator. Importantly, the network rate has gone up ~2TH/s in a single day (after the China news), and we think individuals looking to obtain cryptocurrencies in China will now purchase more computing power given that it is the only legal way to do so.

Nvidia: China Crypto-Currency Ban Only Fuels Their GPU Sales, Says RBC - Barron's

Reply


