Rumor: Intel looking to acquire Nvidia
According to recent reports, Intel has held discussions aimed at possibly acquiring graphics chip maker Nvidia.

The reports claim Jen-Hsun Huang, the CEO of Nvidia, would become the head of the new merged company.

This is not the first time the companies have talked; Intel opened discussions back in 2006, when rival AMD made its play to purchase ATI Technologies. Even back then, Nvidia wanted Huang to lead the merged company, and rebuffed the offer. Intel also offered to license GeForce graphics cores for integration into its recent "Sandy Bridge" Core i-series processors, but Nvidia declined that as well.

Intel's long-time CEO, Paul Otellini, will retire next May, and the company has been looking for a successor. Jen-Hsun Huang is a successful executive with extensive experience in the industry, making him an ideal candidate.

Intel is a roughly $100 billion company and Nvidia is worth about $8 billion, making a deal financially feasible.

Written by: Andre Yoskowitz @ 16 Dec 2012 22:04
Tags
NVIDIA, Merger, Intel, Huang
  • 20 comments
  • ivymike

    Did somebody say "Monopoly"???

    16.12.2012 23:24 #1

  • bob122 (unverified)

    I love that game

    17.12.2012 00:05 #2

  • xnonsuchx

    Ever since Intel's "Larrabee" flopped, Intel's been interested in NVIDIA as a quick way to avoid having to play catch-up with its own lackluster GPU tech.

    17.12.2012 01:41 #3

  • KillerBug

    Originally posted by ivymike: Did somebody say "Monopoly"???
    AMD has ATI for video... Intel has s**t for video... if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.


    17.12.2012 08:01 #4

  • LordRuss

    Originally posted by KillerBug: ...if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.
    I've always had a tit-for-tat, pseudo 'real world' usage argument over this statement for eons... not exactly worth getting into again, seeing as there are so many fences to hang a hypothesis on...

    But it's safe to say Nvidia is such a graphics force unto itself that it doesn't need to pair with anyone.

    Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    I'm still wondering how Intel can release a flagship processor for a G-note with a straight face when Tom's Hardware shows it performing only marginally better than the newest AMD Bulldozer at less than a third the cost.

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.

    http://onlyinrussellsworld.blogspot.com

    17.12.2012 10:46 #5

  • defgod

    Originally posted by LordRuss: Originally posted by KillerBug: ...if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.
    I've always had a tit-for-tat, pseudo 'real world' usage argument over this statement for eons... not exactly worth getting into again, seeing as there are so many fences to hang a hypothesis on...

    But it's safe to say Nvidia is such a graphics force unto itself that it doesn't need to pair with anyone.

    Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    I'm still wondering how Intel can release a flagship processor for a G-note with a straight face when Tom's Hardware shows it performing only marginally better than the newest AMD Bulldozer at less than a third the cost.

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.
    My sentiments exactly!

    17.12.2012 19:43 #6

  • buddyleem

    We will do it for Tegra as well

    Hack a bit, invest a bit, work a bit, jerk a bit

    17.12.2012 20:52 #7

  • bobiroc

    Originally posted by LordRuss: Originally posted by KillerBug: ...if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.
    I've always had a tit-for-tat, pseudo 'real world' usage argument over this statement for eons... not exactly worth getting into again, seeing as there are so many fences to hang a hypothesis on...

    But it's safe to say Nvidia is such a graphics force unto itself that it doesn't need to pair with anyone.

    Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    I'm still wondering how Intel can release a flagship processor for a G-note with a straight face when Tom's Hardware shows it performing only marginally better than the newest AMD Bulldozer at less than a third the cost.

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.
    100% agree. When it comes to real-world performance, AMD performs very well, with most benchmarks skewing performance one way or another. Sure, Intel will win a few seconds here and a minute there on some benchmarks, but if you compare Intel and AMD processors at the same price point, you will usually find similar performance. You also have to consider more than the processor: motherboards of comparable quality and features for Intel chips are often as much as 50% more expensive, which factors into the cost of the system. If you are building for the average user who only needs integrated graphics and isn't playing games, AMD APUs offer great value and very good performance.

    AMD Phenom II 965 @ 3.67GHz, 8GB DDR3, ATI Radeon HD 5770, 256GB OCZ Vertex 4, 2TB additional HDD, Windows 7 Ultimate.

    http://www.facebook.com/BlueLightningTechnicalServices

    17.12.2012 20:55 #8

  • KillerBug

    Originally posted by LordRuss: Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    ....

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.
    Making a processor more capable is about a lot more than just cores and clock. Hardware functions make things go much, much faster... even when the software is very well designed to work without them. Saying "tweak the software" doesn't make the existing software better, nor does it cause developers to optimize their next-gen software that is already in alpha testing, nor does it allow heavily optimized software to run better without hardware functions.

    I do have to give AMD props finally... sorta. I had a quad-core AMD for about three years... delayed my full overhaul for two years waiting for them to include hardware AES... finally gave up and bought an i5. Now AMD is finally adding this to a few of their chips... so I'm thinking of replacing my old Intel laptop. Why do I mention this? Because I'd love to see CUDA on budget laptops... not just for the few apps that support it now, but for the many more apps that would support it if it were commonplace.
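
    For what it's worth, here's a minimal sketch of how an app can test for hardware AES before picking a code path (my own illustration, not from the article; it assumes Linux and Python, where /proc/cpuinfo lists the CPU's feature flags):

        # Check whether the CPU advertises AES-NI via the "aes" feature flag.
        # Linux-only sketch: /proc/cpuinfo exposes one "flags" line per core.
        def has_cpu_flag(flag):
            with open("/proc/cpuinfo") as f:
                for line in f:
                    if line.startswith("flags"):
                        return flag in line.split()
            return False

        if has_cpu_flag("aes"):
            print("AES-NI present: use the hardware-accelerated path")
        else:
            print("no AES-NI: fall back to software AES")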

    Oh, as for your comment about on-processor graphics being pointless except on laptops: that is a huge part of the market now. I really hate to say it, but gaming PCs are not a big part of the market. Most people just want a laptop that does basic things; screen size and price are the main things people look at (as well as colored shells and Apple logos). If they really want to go big, they might opt for the one with a Blu-ray drive and an 18" screen... but video compression probably isn't even something they know about. To tell you the truth, that's all I want in a laptop (not the colored shells and Apple logos, of course). I'm not about to spend two grand on a laptop that can't compete with an $800 desktop, so my laptop is just for basic things... I don't even need a Blu-ray drive.


    17.12.2012 21:38 #9

  • LordRuss

    Originally posted by KillerBug:
    Oh, as for your comment about on-processor graphics being pointless except on laptops: that is a huge part of the market now. I really hate to say it, but gaming PCs are not a big part of the market. Most people just want a laptop that does basic things; screen size and price are the main things people look at (as well as colored shells and Apple logos). If they really want to go big, they might opt for the one with a Blu-ray drive and an 18" screen... but video compression probably isn't even something they know about. To tell you the truth, that's all I want in a laptop (not the colored shells and Apple logos, of course). I'm not about to spend two grand on a laptop that can't compete with an $800 desktop, so my laptop is just for basic things... I don't even need a Blu-ray drive.
    I hear you... playing the cheap seats is just a fact of life, and sometimes we're all along for the ride. I'm just one of those guys who happens to like a Blu-ray drive in his laptop, seeing as I'm still one of the crowd that gets use out of a large screen/format.

    But now that everyone wants to use their laptop as a gaming system as well (which might put a small kink in your PC gaming theory), that's all the more reason for the move to a CPU/GPU hybrid.

    Us old goats still tend to like a machine where you can replace only the parts that are broken at any given moment; having to replace the entire machine is my complaint.

    http://onlyinrussellsworld.blogspot.com

    18.12.2012 10:06 #10

  • A5J4DX

    do it!

    18.12.2012 15:38 #11

  • ZippyDSM

    Intel would be better off buying AMD and using their tech for removable CPUs while making all the Intel CPUs non-removable.

    Copyright infringement is nothing more than civil disobedience against a bad set of laws. Let's renegotiate them.

    ---
    Check out my crappy creations
    http://zippydsm.deviantart.com/

    18.12.2012 19:27 #12

  • KillerBug

    Originally posted by LordRuss: Originally posted by KillerBug:
    Oh, as for your comment about on-processor graphics being pointless except on laptops: that is a huge part of the market now. I really hate to say it, but gaming PCs are not a big part of the market. Most people just want a laptop that does basic things; screen size and price are the main things people look at (as well as colored shells and Apple logos). If they really want to go big, they might opt for the one with a Blu-ray drive and an 18" screen... but video compression probably isn't even something they know about. To tell you the truth, that's all I want in a laptop (not the colored shells and Apple logos, of course). I'm not about to spend two grand on a laptop that can't compete with an $800 desktop, so my laptop is just for basic things... I don't even need a Blu-ray drive.
    I hear you... playing the cheap seats is just a fact of life, and sometimes we're all along for the ride. I'm just one of those guys who happens to like a Blu-ray drive in his laptop, seeing as I'm still one of the crowd that gets use out of a large screen/format.

    But now that everyone wants to use their laptop as a gaming system as well (which might put a small kink in your PC gaming theory), that's all the more reason for the move to a CPU/GPU hybrid.

    Us old goats still tend to like a machine where you can replace only the parts that are broken at any given moment; having to replace the entire machine is my complaint.
    Yeah... we old goats do love being able to replace just what needs to be replaced... but there is the performance/price issue as well. The average income is shrinking fast, and the value of that income is going down even faster. Many people (myself included) would love a $4000 gaming laptop that is slightly faster than a year-old $1000 desktop... but it just isn't an option for a lot of people. Heck, a lot of people can't even afford the $1000 desktop every other year.

    ...But that really isn't the point... because there are a lot of people who don't game, don't do CAD, and don't even mind a 15" screen. They just want a basic system, and the only reason they upgrade their three-year-old single-core laptop is that it needs a format/reinstall, and paying someone to do it would cost almost as much as the $250 replacement they're going to buy. When these people start buying the next Celeron with on-chip Nvidia CUDA, we get to see CUDA in tons of apps, and then those apps are blistering fast on our desktops with high-end Nvidia cards.
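
    That trickle-up only works if apps can cheaply detect a CUDA-capable GPU and fall back to the CPU otherwise. Here's a rough sketch of such a probe (my own illustration, assuming Linux with the NVIDIA driver installed, calling the CUDA driver API through Python's ctypes):

        # Probe for a CUDA-capable GPU via the CUDA driver API (libcuda).
        # If the driver library or a device is missing, use the CPU path.
        import ctypes

        def cuda_device_count():
            try:
                libcuda = ctypes.CDLL("libcuda.so")  # NVIDIA driver library
            except OSError:
                return 0  # no NVIDIA driver installed
            if libcuda.cuInit(0) != 0:  # CUDA_SUCCESS == 0
                return 0
            count = ctypes.c_int(0)
            if libcuda.cuDeviceGetCount(ctypes.byref(count)) != 0:
                return 0
            return count.value

        n = cuda_device_count()
        print("CUDA devices: %d -> %s" % (n, "GPU path" if n else "CPU fallback"))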


    18.12.2012 21:14 #13

  • defgod

    Originally posted by KillerBug: Originally posted by LordRuss: Originally posted by KillerBug:
    Oh, as for your comment about on-processor graphics being pointless except on laptops: that is a huge part of the market now. I really hate to say it, but gaming PCs are not a big part of the market. Most people just want a laptop that does basic things; screen size and price are the main things people look at (as well as colored shells and Apple logos). If they really want to go big, they might opt for the one with a Blu-ray drive and an 18" screen... but video compression probably isn't even something they know about. To tell you the truth, that's all I want in a laptop (not the colored shells and Apple logos, of course). I'm not about to spend two grand on a laptop that can't compete with an $800 desktop, so my laptop is just for basic things... I don't even need a Blu-ray drive.
    I hear you... playing the cheap seats is just a fact of life, and sometimes we're all along for the ride. I'm just one of those guys who happens to like a Blu-ray drive in his laptop, seeing as I'm still one of the crowd that gets use out of a large screen/format.

    But now that everyone wants to use their laptop as a gaming system as well (which might put a small kink in your PC gaming theory), that's all the more reason for the move to a CPU/GPU hybrid.

    Us old goats still tend to like a machine where you can replace only the parts that are broken at any given moment; having to replace the entire machine is my complaint.
    Yeah... we old goats do love being able to replace just what needs to be replaced... but there is the performance/price issue as well. The average income is shrinking fast, and the value of that income is going down even faster. Many people (myself included) would love a $4000 gaming laptop that is slightly faster than a year-old $1000 desktop... but it just isn't an option for a lot of people. Heck, a lot of people can't even afford the $1000 desktop every other year.

    ...But that really isn't the point... because there are a lot of people who don't game, don't do CAD, and don't even mind a 15" screen. They just want a basic system, and the only reason they upgrade their three-year-old single-core laptop is that it needs a format/reinstall, and paying someone to do it would cost almost as much as the $250 replacement they're going to buy. When these people start buying the next Celeron with on-chip Nvidia CUDA, we get to see CUDA in tons of apps, and then those apps are blistering fast on our desktops with high-end Nvidia cards.
    I just had a way-off-topic thought on your comment, concerning market saturation and how it ultimately affects the economy as well. The everyday person or noob who won't spend the time/money to have someone format/reinstall will just purchase a cheap new machine to play with. When those cheap new ones come with the architecture to support better graphics and applications, it will ultimately "trickle up" to the higher-end desktops. Which is also a much better economic model than the current one we have.

    18.12.2012 21:37 #14

  • KillerBug

    Originally posted by defgod: I just had a way-off-topic thought on your comment, concerning market saturation and how it ultimately affects the economy as well. The everyday person or noob who won't spend the time/money to have someone format/reinstall will just purchase a cheap new machine to play with. When those cheap new ones come with the architecture to support better graphics and applications, it will ultimately "trickle up" to the higher-end desktops. Which is also a much better economic model than the current one we have.
    We have already seen it to a certain extent; we now see everyday apps using things that were once just for gamers, like MMX instructions that are on every Intel and AMD chip regardless of price. Heck, we have 64-bit apps that don't come close to needing 4GB under normal use, just because you can't find a new 32-bit laptop anymore. Even when a laptop comes with 2GB of memory and an x86 OS, it has an x64 chip.
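
    That CPU/OS mismatch is easy to check for yourself. A small sketch (my own, Linux-only) comparing the CPU's "lm" flag (long mode, i.e. x86-64 capability) against the bitness of the running process:

        # A 64-bit-capable CPU ("lm" flag) may still run a 32-bit OS or app.
        import sys

        def cpu_is_64bit():
            with open("/proc/cpuinfo") as f:  # Linux-only
                for line in f:
                    if line.startswith("flags"):
                        return "lm" in line.split()  # lm = x86-64 long mode
            return False

        process_is_64bit = sys.maxsize > 2**32
        print("CPU 64-bit: %s, this process 64-bit: %s"
              % (cpu_is_64bit(), process_is_64bit))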


    19.12.2012 08:27 #15

  • ivymike

    Intel would be better off acquiring VIA or ARM Holdings.

    19.12.2012 14:28 #16

  • Mr-Movies

    Originally posted by LordRuss: Originally posted by KillerBug: ...if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.
    I've always had a tit-for-tat, pseudo 'real world' usage argument over this statement for eons... not exactly worth getting into again, seeing as there are so many fences to hang a hypothesis on...

    But it's safe to say Nvidia is such a graphics force unto itself that it doesn't need to pair with anyone.

    Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    I'm still wondering how Intel can release a flagship processor for a G-note with a straight face when Tom's Hardware shows it performing only marginally better than the newest AMD Bulldozer at less than a third the cost.

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.
    I agree, with the exception of compressed video. I think AMD does a better job in that market because you have more cores, and even at 70% of the clock speed the extra cores perform better/faster for me than an Intel-based system with fewer cores.
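
    Back-of-the-envelope, that core-count argument looks like this (my own arithmetic, assuming a perfectly parallel workload, which real transcoders only approximate):

        # Idealized throughput = cores x relative clock (perfect scaling assumed).
        amd = 8 * 0.70    # 8 cores at ~70% of the rival's clock speed
        intel = 4 * 1.00  # 4 cores at full clock speed
        print("AMD: %.1f vs Intel: %.1f core-equivalents" % (amd, intel))  # 5.6 vs 4.0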

    21.12.2012 08:22 #17

  • Rack17

    Interesting: Microsoft holds rights over any sale of Nvidia.

    21.12.2012 11:03 #18

  • mrted

    Originally posted by Rack17: Interesting: Microsoft holds rights over any sale of Nvidia.
    @Rack17... I'm guessing you are referring to this excerpt from an article I found on Google:
    "Microsoft has held a matching right with Nvidia since 2000. Nvidia apparently has the ultimate takeover defense in place. Here's the description of the right from Nvidia's recent 10-Q:
    On March 5, 2000, we entered into an agreement with Microsoft in which we agreed to develop and sell graphics chips and to license certain technology to Microsoft and its licensees for use in the Xbox. Under the agreement, if an individual or corporation makes an offer to purchase shares equal to or greater than 30% of the outstanding shares of our common stock, Microsoft may have first and last rights of refusal to purchase the stock. The Microsoft provision and the other factors listed above could also delay or prevent a change in control of NVIDIA."

    Interesting ramifications...

    21.12.2012 20:25 #19

  • kutulu1

    Originally posted by bobiroc: Originally posted by LordRuss: Originally posted by KillerBug: ...if Intel chips were not so much better than AMD chips, I'd say this was leveling the playing field.
    I've always had a tit-for-tat, pseudo 'real world' usage argument over this statement for eons... not exactly worth getting into again, seeing as there are so many fences to hang a hypothesis on...

    But it's safe to say Nvidia is such a graphics force unto itself that it doesn't need to pair with anyone.

    Intel (like AMD) has hit the operating threshold, so making a processor more 'capable' is what's going to bring customers running. Hyperthreading just doesn't cut it for me; I'd rather have real cores. However, onboard video assist seems silly outside the laptop/netbook market. That just sounds like we're getting back to making the RISC processor.

    I'm still wondering how Intel can release a flagship processor for a G-note with a straight face when Tom's Hardware shows it performing only marginally better than the newest AMD Bulldozer at less than a third the cost.

    Tweak the software so my AMD system performs to Intel specs, conspiracy be damned. I'll wait the extra 1.5 minutes for the movie to compress rather than spend $1K more on an Intel computer.
    100% agree. When it comes to real-world performance, AMD performs very well, with most benchmarks skewing performance one way or another. Sure, Intel will win a few seconds here and a minute there on some benchmarks, but if you compare Intel and AMD processors at the same price point, you will usually find similar performance. You also have to consider more than the processor: motherboards of comparable quality and features for Intel chips are often as much as 50% more expensive, which factors into the cost of the system. If you are building for the average user who only needs integrated graphics and isn't playing games, AMD APUs offer great value and very good performance.
    I was an avid Intel user until I decided to try the new AMD 8-core processors; I have no regrets. I can transcode Blu-rays in 4-5 hours and DVDs in 25 minutes. I can play The Witcher 2 at the highest settings without a stutter. All of this for two-thirds the cost of a comparable system using an i7 processor:
    AMD FX-8120 @ 3.1GHz, 16MB cache
    Corsair Vengeance 1600MHz RAM, 16GB
    Sapphire Radeon 6970 video card, 2GB

    23.12.2012 04:13 #20
