After abandoning its own GPU for supercomputers, machine learning, and video games in 2009, Intel has returned to the market with a new 72-core Xeon Phi to compete with Nvidia’s growing portfolio of GPUs.
The Xeon Phi ‘Knights Landing’ chip, announced at the International Supercomputing Conference in Frankfurt, Germany, last week, is Intel’s most powerful and expensive chip to date and is aimed at machine learning and supercomputers, two areas where Nvidia’s GPUs have flourished.
Inside the chip are 72 cores running at 1.5GHz, alongside 16GB of integrated stacked memory. The chip also supports up to 384GB of DDR4 memory, giving machine learning programs ample room to scale.
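Those headline numbers translate into peak arithmetic throughput with simple multiplication. Here is a back-of-the-envelope sketch; the per-core details (two AVX-512 vector units, each retiring one 8-wide double-precision fused multiply-add per cycle) are Knights Landing specifics not stated in the article:

```python
# Back-of-the-envelope peak FLOPS for Knights Landing.
# Assumptions beyond the article's figures: 2 AVX-512 vector units
# per core, 8 double-precision lanes per unit, and a fused
# multiply-add counting as 2 floating-point ops per lane per cycle.
cores = 72
clock_hz = 1.5e9
vector_units_per_core = 2
dp_lanes_per_unit = 8
flops_per_fma = 2

peak_dp_flops = (cores * clock_hz * vector_units_per_core
                 * dp_lanes_per_unit * flops_per_fma)
print(f"Peak double-precision: {peak_dp_flops / 1e12:.2f} TFLOPS")  # ~3.46
```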
Excel has no built-in way to insert a blank row between every existing row, but it can be done without adding each row by hand.
The trick is to build a helper column containing the numbers 1 to n twice (where n is the number of data rows, so 2n values in all), then sort by that column: each blank row sorts in next to the data row that shares its number, leaving one blank row between every pair of data rows. Here’s how (a scripted sketch follows the steps)…
1. Locate or insert a blank column.
2. Optionally name the column “Order”, or leave it unnamed if the data has no headers.
3. Type the number “1” in the first row of data.
4. Fill the series down to the last row of data, so the column reads 1, 2, 3, … n.
5. Select and copy all the numbers just created in that column.
6. Paste the selection below the data, starting in the first blank row.
7. Sort by the column of numbers from Smallest to Largest.
8. Optionally delete the column of numbers.
Tip: if you need multiple blank rows between each data row, repeat step 6 as many times as needed before sorting.
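For anyone who would rather script it, here is a minimal sketch of the same helper-column trick using pandas; the DataFrame contents are hypothetical, and any table works the same way:

```python
import pandas as pd

# Hypothetical sample data; substitute your own table.
df = pd.DataFrame({"name": ["alpha", "beta", "gamma"],
                   "value": [1, 2, 3]})

n = len(df)
# n empty rows with the same columns (the "pasted" blank block).
blanks = pd.DataFrame("", index=range(n), columns=df.columns)

# Helper "Order" column: 1..n on the data, 1..n again on the blanks,
# exactly like the spreadsheet steps above.
df["Order"] = range(1, n + 1)
blanks["Order"] = range(1, n + 1)

# Stack, sort by the helper column (mergesort is stable, so each data
# row stays ahead of its matching blank), then drop the helper.
spaced = (
    pd.concat([df, blanks])
      .sort_values("Order", kind="mergesort")
      .drop(columns="Order")
      .reset_index(drop=True)
)
print(spaced)  # data row, blank row, data row, blank row, ...
```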
There’s a war between two visions of how the ubiquitous AI-assisted future will be rendered: on the cloud or on the device. And as with any great drama it helps the story along if we have two archetypal antagonists. On the cloud side we have Google. On the device side we have Apple. Who will win? Both? Neither? Or do we all win?
If you had asked me a week ago I would have said the cloud would win. Definitely. If you read an article like Jeff Dean On Large-Scale Deep Learning At Google you can’t help but be amazed at what Google is accomplishing. Impressive. Wide ranging. Smart. Systematic. Dominant.
Apple has been largely absent from the trend of sprinkling deep learning fairy dust on its products. This should not be all that surprising. Apple moves at its own pace. Apple doesn’t reach for early adopters; it releases a technology when it’s a win for the mass consumer market.
There’s an idea that, because Apple is so secretive, it might have vast deep learning chops hidden away that we don’t even know about yet. We, of course, have no way of knowing.
What may prove more true is that Apple is going about deep learning in a different way: differential privacy + powerful on-device processors + offline training with downloadable models + a commitment to really, really not knowing anything personal about you + the deep learning equivalent of perfect forward secrecy.
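Differential privacy is the most concrete item on that list, and the classic randomized-response mechanism shows its flavor. Below is a minimal sketch under hypothetical assumptions (the survey scenario and numbers are invented, and any real deployment would use more sophisticated mechanisms): each individual report is plausibly deniable, yet the population-level rate is still recoverable.

```python
import random

def randomized_response(truth):
    """Classic local differential privacy mechanism.

    First coin flip: heads, report the truth; tails, report the
    outcome of a second fair coin instead, regardless of the truth.
    """
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports):
    # Under this scheme, P(report is True) = 0.5 * true_rate + 0.25,
    # so invert that to get an unbiased estimate of the true rate.
    observed = sum(reports) / len(reports)
    return (observed - 0.25) / 0.5

# Hypothetical population: 100,000 users, 30% with a sensitive attribute.
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(f"Estimated rate: {estimate_true_rate(reports):.3f}")  # close to 0.300
```

No single report reveals anything certain about its sender, which is the sense in which a vendor can learn aggregate behavior while really, really not knowing anything personal about you.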