►RX 7600 (Amazon Affiliate):
►RX 7600 (Newegg Affiliate):
►RTX 4060 Ti (Amazon Affiliate):
►RTX 4060 Ti (Newegg Affiliate):

RX 7600 and RTX 4060 Ti just got benchmarks, Intel’s about to change PCs forever, here’s AMD’s RX 7600, Intel’s 14th Gen has a new CPU core and the GPU market is worse than I thought! Stay tuned…

****Items featured in this video available here****
Newegg (Affiliate):
Amazon US (Affiliate):

Join The Discord:

Facebook:
Gamer Meld Merch:

Gamer Meld Sponsors:
Support Me On Patreon:


0:00 Intro
0:32 GPU Market Is In Trouble
2:25 Intel’s 14th Gen Brings 3 Core Types
4:14 This Is AMD’s New RX 7600
5:12 RX 7600 And RTX 4060 Ti Performance
7:29 Intel’s About To Change PCs Forever


This Post Has 29 Comments

  1. ehsnils

    The downside of removing 16- and 32-bit support is that it's going to cause trouble for those who use legacy equipment requiring 32- or even 16-bit support.
    At my workplace there is some equipment that, even though it's aged, would cost 7 figures to replace, and that's one of the headaches added: you can't just upgrade the controlling computer.

  2. Azox

    No one buying the 4070 is a really good hit on Nvidia for pricing GPUs like this, and now it knows frame generation is not enough as a selling point.

  3. Ryan T

    Nvidia messed up with the 4070; its price-to-performance doesn't hold up compared to the 4070 Ti.

  4. Son Goku

    @Gamer Meld
    Show us your Zamasu shirt. Where'd you get it?

  5. ronch550

    Back in 2021 I bought a 1050 Ti because I didn't want to pay too much for overpriced GPUs back then. I thought I'd buy something better when prices came back down, but after using it for 2 years I haven't found much reason to upgrade. Sure, I only game at 720p but on a 24" monitor 720p looks just fine. I don't play the most demanding titles either so the 1050 Ti works great and the power consumption is pretty low too.

  6. Valen Ron

    Intel has made historical attempts to kill x86 16-bit and 32-bit, e.g.:
    Pentium Pro's degraded 16-bit performance.
    Itanium's degraded IA-32 (x86 32-bit) performance, on Intel's desired 64-bit CPU path.
    i860 (non-x86 32-bit CPU).
    i960 (non-x86 32-bit CPU).
    iAPX 432 (non-x86 32-bit CPU).

  7. Soulbytes

    Yay, it's time to buy a second-hand Nvidia 30 series or AMD 6000 series.

  8. John Doh

    The statement that the 4070 isn't selling is garbage. Maybe in SOME markets it's not, but in the US, you can look at Amazon's top 50 best selling GPUs. A week ago 1/5 of that list was an AIB 4070. Between it and the 4070 Ti they dominate that top 50 list, and if you subtract out AMD GPUs and look ONLY at Nvidia GPUs, that is about 40% of Nvidia GPU sales right now.

    There are two things that are true. AMD at one time had about 5 GPUs in this list, which was a TERRIBLE showing. I think because of how low their pricing has been now for a long period of time AMD GPUs have started to make inroads. There are now 16 AMD GPUs in Amazon's top 50 list, so FINALLY PEOPLE ARE LOOKING AT VALUE!!!!!!!!

    But there are more 4070s than 4070 Tis in Amazon's top 50 best-selling GPUs. So ALL THE YOUTUBERS WHO KEEP SAYING THE SALES ARE BAD ARE WRONG. You're just WRONG. In fact, right now the 4070 is probably one of the best-selling Lovelace GPUs, at least on Amazon.

    So you're WRONG. The DATA disagrees.

    I don't think Nvidia wants to SELL it. I think their profit margin is lower on this product, and because Youtubers have been saying it's not selling and that it's crap, it gives Nvidia an excuse to cut back production. They'd rather sell the more expensive 4070 Ti and, frankly, the 4060 Ti, which is more profitable thanks to its smaller die, and the cost of TSMC dies is OUTRAGEOUS on these top-end nodes.

    So no, sales aren't going to make a decision for NGREEDIA, it's going to be profit margin.

    Thanks stupid ass Youtubers for being so critical over an Nvidia GPU that ACTUALLY MAKES SENSE!

    You all are so focused on CUDA cores that you forget about Tensor and RT cores, and considering RT IS BECOMING MORE PREVALENT and at SOME POINT will be the dominant way lighting is done (including UE5's Lumen, which is still RT), the 4070 is actually a good GPU. It's balanced. It's $200 USD less than a 4070 Ti.

    When you drop down to a 4060 Ti the amount of RT cores goes to CRAP so you're basically paying for the CUDA cores, whereas on the 4070 you can USE RT settings.

    Really, Youtubers should STOP being so negative towards products. If you do product reviews, sure, say your words about price/performance but please STOP all the bashing. The die size for the 4070 is the same as the 4070 Ti and TSMC typically has an excellent defect rate, so WHY IS IT you think they'd reduce the number of 4070s? It's nGREEDia.

  9. You still have lots of games that will fail to start without baked-in virtualization. Not the worst idea in the world, but we will start seeing more compatibility issues than before. Virtualization is probably inevitable as we make the switch to RISC CPUs.

  10. John Doh

    Dude, Intel 14th gen is a mix of different architectures.

    Why is this so newsworthy? These parts are going into laptops, not desktops, that is, if you're talking about Meteor Lake. Meteor Lake needs the Intel 4 process node, and for desktop Intel has been talking about skipping Meteor Lake, releasing a Raptor Lake refresh for desktop 14th gen, and then going to Arrow Lake for 15th gen.

    Intel has already talked a lot about Meteor Lake and how it can take dies of various types, from different manufacturers, etc., to piece together a CPU. Yes, that's cool. It will be cooler when it ACTUALLY releases to desktop. So for the laptop market, yes, it could be neat that in low-power situations you shut down the main core tile and use 2 low-power cores in whatever tile Intel puts them in. Calling it an SoC tile is confusing though, because SoC typically refers to the ENTIRE package. It means you bring multiple functions into a package, which is what Intel and AMD have both been doing for a few years now.

    AMD has been talking about powering down cores for laptops, so I imagine they'll basically accomplish the same thing but in a different way.

  11. no name

    Intel should focus on a 64-bit-only compute base, and try to add 128-bit or 256-bit compute to their processors.

  12. John Doh

    Dude, there is a LOT of 32-bit code out there that people use all the time. Most major companies have moved to 64-bit code, but there's a LOT of 32-bit code whose developers may not plan to move to 64-bit.

    And when Intel talks about THEIR development of a 64-bit ISA, what are they talking about? They have a 64-bit ISA for server CPUs, but not for the CPUs that run x86-64. AMD developed the 64-bit ISA that's being used for x86-64.

    So what does Intel REALLY mean? Sure, I can see them wanting smaller cores and getting rid of the 32-bit ISA, but if they do that, whose 64-bit ISA are they going to use? Are they once again going to try to push THEIR ISA, creating 2 different ISAs that have to be supported for PC?
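As an aside on the 32- vs 64-bit split discussed above: the difference is visible even from trivial code. A minimal, purely illustrative Python sketch that just reports the word size of whatever interpreter runs it:

```python
import struct
import sys

# Report whether this interpreter is a 32- or 64-bit build.
# struct.calcsize("P") is the size of a C pointer in bytes:
# 8 on a 64-bit build, 4 on a 32-bit (ILP32) build.
pointer_bytes = struct.calcsize("P")
print(f"pointer size: {pointer_bytes} bytes")
print(f"word size:    {pointer_bytes * 8}-bit")
print(f"sys.maxsize:  {sys.maxsize}")  # 2**63 - 1 on 64-bit builds
```

A 32-bit program isn't wrong, it just assumes 4-byte pointers throughout, which is exactly why the legacy code John mentions can't simply be flipped to 64-bit.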

  13. JTMC93

    I just hope the RX 7600 keeps the USB-C connection… It would be valuable for a lower cost creative PC for art and such. (A lot of modern pen displays can use a single USB-C for Power, Data, and Video.)

  14. Zelara

    I am glad we're moving towards removing legacy code from processors. That bit of news thrilled me the most.

  15. Zeek

    If sales are so bad, stop trying to sell the GPUs at the markup from the crypto boom.
    Demand wants normal prices, not $1k for midrange.

  16. Timosan96

    Wait, you uploaded the video without any speaking audio?

  17. Vens Roofcat

    My best bet: Intel discarding 16-bit compatibility will shorten startup time by a whole 0.7234 seconds. I'd love to be corrected, but this sounds like some really simple checks.
    Other than that, there is a ton of legacy stuff in all computer-related things that we keep carrying with us.

  18. Maradnus

    We're on the RTX 6000 already? We had the 5000 last week.

  19. aowi7280

    Not going to buy a GPU that lame for that much money.

  20. Pie Studios

    As a content creator who's been burned one too many times by Nvidia, the RX 7600 is a VERY interesting prospect. Performance that beats my 5700 XT with AV1 encoding that smashes AMD's garbage H264 encoder. I just hope it's priced reasonably.

  21. kees garfin

    Of course… discontinue it and say no one wants it, so they can keep the narrative that their pricing is correct and ask even more in the future, rather than actually adjusting the price to what it should be.

  22. Mike B

    Those benchmarks say a lot about GPU architecture… The 7600 & 4060 Ti have the same specs (everywhere) except for shader count.

  23. Skias Abhorsen

    This is gonna be funny if people show them the middle finger and don't buy this junk at this crazy price 😂😂😂

  24. fl00d

    Ohhh so this is the face that does that godawful voice….

  25. Eric Tay

    Intel's x86 instruction set is actually decoded into simpler micro-operations and executed on RISC-like silicon. I'm not sure if AMD does this since I've never studied their microarchitecture.

    Removing 16/32-bit instructions probably will not remove much silicon, because the ALU is a 64-bit RISC-like core anyway. The predictor silicon may be simplified, I guess, unless that's also a RISC-based controller.

    The main advantages are a reduction in microcode complexity and better boot-up security. Removing legacy modes reduces malicious code's attack surface, especially for the newer Spectre variants that target branch and cache prediction.

    Modern CPUs can easily crack 16/32-bit crypto, so moving to pure 64-bit helps increase the randomness of the number generator and prevents the use of small crypto keys. This also helps the move to a 128-bit architecture. I know most math operations use 128-bit and larger datasets in SSE/AVX etc., but as we gather and process more data, eventually we'll move to a 128-bit ALU.
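To put the key-width point above in rough numbers, here's a back-of-envelope sketch; the 10^9 guesses-per-second rate is an assumption for illustration, not a measured figure:

```python
# Back-of-envelope: worst-case time to exhaust a keyspace by brute force,
# assuming a hypothetical rate of 1e9 guesses per second.
GUESSES_PER_SECOND = 1e9
SECONDS_PER_YEAR = 365 * 24 * 3600

def brute_force_seconds(key_bits: int) -> float:
    """Seconds to try every key of the given width at the assumed rate."""
    return (2 ** key_bits) / GUESSES_PER_SECOND

for bits in (16, 32, 64):
    secs = brute_force_seconds(bits)
    print(f"{bits:>2}-bit key: {secs:.3g} s (~{secs / SECONDS_PER_YEAR:.3g} years)")
```

At that assumed rate, a 16-bit keyspace falls in well under a millisecond and a 32-bit one in a few seconds, while 64 bits comes out to roughly 585 years, which is the gap the comment is gesturing at.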

  26. Dan Archer

    If Intel goes straight 64-bit, how many of the Linux systems out there will have to be rewritten?
