blackest wrote:
Some Apple laptops have two GPUs too: one built into the CPU and a second discrete GPU. The intention is to use the discrete GPU when the situation can take advantage of it; using the integrated GPU is more power efficient.
I wonder if it can use both at the same time.
Without knowing the precise model, it isn't possible to say whether this laptop has a discrete card or not; some versions do and some don't.
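Since the model determines the answer, the quickest check is just to list the machine's display adapters. A minimal sketch in Python, assuming the usual OS tools are available (wmic on Windows, system_profiler on macOS, lspci on Linux); the `list_gpus` helper name is mine, not from any library:

```python
import platform
import subprocess

def list_gpus():
    """Return raw text listing the display adapters on this machine.

    Shells out to the standard per-OS tool; output format differs by OS,
    and an empty string just means the tool wasn't found or printed nothing.
    """
    system = platform.system()
    if system == "Windows":
        # wmic is deprecated on newer Windows builds but still widely present.
        cmd = ["wmic", "path", "win32_VideoController", "get", "name"]
    elif system == "Darwin":
        cmd = ["system_profiler", "SPDisplaysDataType"]
    else:
        # On Linux, lspci shows every adapter, so a laptop with both an
        # integrated Intel GPU and a discrete card will list two entries.
        cmd = ["sh", "-c", "lspci | grep -i 'vga\\|3d\\|display'"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout

if __name__ == "__main__":
    print(list_gpus())
```

If two adapters show up (say, an Intel entry and an NVIDIA or AMD entry), the laptop has switchable graphics; a single Intel entry means integrated-only.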
I'll venture to say that if the graphics name is anything other than Intel, it's a discrete card overriding the Intel on-board graphics, at least when extra graphics power is needed. Often, though not always, the other brands are added as a value-added bonus with much larger video memory, and that comes in the form of a discrete card.
For example, my Dell desktop has Intel on-board graphics, but it also had a humongous ATI Radeon card in it: over a foot long, 5" tall, half a pound in weight, with a 3"-thick case (taking up two slot positions) and two fans that never ran in all the time I had it, yet equipped with only 512MB of RAM. Photoshop sometimes had compatibility problems with it, though not always. Driver updates weren't available because the PC and card were five years old. It also didn't handle screen refresh well after a serious action like sharpening a third of a TIFF file, probably because of the 512MB of RAM.
So I went to Best Buy and bought a flaming fast, overkill NVIDIA GeForce card by PNY with 2GB of RAM for $79. I would have bought one with 4GB, but they didn't have any in stock. I wanted to buy locally at a brick-and-mortar store so that if there was a problem I could take it back the same day and get something different. This one is only 4" long and 2 1/2" tall, has no casing and no fan, and replaced that big hog with ease.
I put in the driver and apps and away it went. After it was up and running it updated itself, and that was that. No more compatibility issues; the OpenGL (and OpenCL) acceleration is in full effect in Photoshop all the time. I didn't change or increase my resolution, because I don't need any more than 1920x1080, the maximum of my monitor, and I don't edit 4K video or anything. But the monitor does look a little crisper than before. Very minor, but it's there. No more waiting for a screen refresh after a serious edit, which was my main concern. When a stream of TIFF photos, say 10, is loaded into Photoshop from ACR, they pop right in one after another instead of pausing for a long period. I edit some HD video from time to time, and my rendering times have decreased dramatically.
Oops... I forgot the point. When I had problems with the PC, say an installation stalled for some reason, my Intel graphics chip took over and provided the low-res Safe Mode boot, because no driver was loaded for the ATI Radeon card. So that shows some form of Intel graphics is on-board, although not a very good one in my case. Intel has gotten much more serious about good on-board graphics since my PC was built, though. I guess they finally recognized they were missing an obvious built-in market for a product they could offer, and make more money while doing so.
So many notebooks and such work just fine with the Intel graphics of today, unless you want really serious graphics power for gaming or video editing; then the legendary response is to head over to AMD (Radeon) or NVIDIA (GeForce), because they're known for that. PC manufacturers offer that option with a discrete card as a competitive edge over each other. A couple of weeks ago I bought a Dell all-in-one with an Intel Celeron CPU for a dedicated purpose that doesn't require big processing power. It has Intel on-board graphics and does a really nice job: it plays HD movies from the built-in DVD player/recorder, and it has an HDMI output that I can dedicate as the second monitor of a two-monitor rig if I want to.