About jaxa
- Posts: 2,331
- Days Won: 52 (jaxa last won the day on November 23; jaxa had the most liked content!)
- Reputation: 962 (Legendary)
- Birthday: 04/01/1982
- Website URL: http://t4lg.anodal.org/
- Recent Profile Visitors: 63,795 profile views
-
I'll probably try Thief 3 for the first time in forever after snobel finishes making a Linux version of Sneaky Upgrade. https://www.ttlg.com/forums/showthread.php?t=138607&page=82 https://www.ttlg.com/forums/showthread.php?t=153010
-
jaxa started following What Linux version are you using?
-
Zorin OS
-
Warhammer: Vermintide 2 is 100% off on Steam: https://slickdeals.net/f/18848197-warhammer-vermintide-2-pc-digital-download-free https://store.steampowered.com/app/552500/Warhammer_Vermintide_2/ It was previously free, so you may already have it.
-
There's a lot of talk about the bubble bursting imminently. Be careful. As for an RTX 6050 hitting $300, I'll note that the RTX 5050 has the same MSRP as the RTX 3050: $250. What sucks about it is that it's generally a worse card than the RTX 4060. Some of the blame lies with the 40-series and 50-series using the same node, but the design of the GB207 die in the RTX 5050 just kinda sucks. The 5060 Ti (8 GB) uses a ~22% larger die to get ~40% more performance at 1080p, while the 5060, cut down from that same die, still gets ~26.5% more. Source: https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5050-review/4

Even if 60-series pricing sucks, it should look a lot better thanks to being on a new node. It's also likely to use a lot of 3 GB GDDR7 chips, including what may be the first 128-bit 12 GB cards. The delayed/cancelled 50 Super series was never even rumored to include a 5060 12 GB, just a 5070 S 18 GB, 5070 Ti S 24 GB, and 5080 S 24 GB. The RTX 3050 6 GB remains one of the superior 75W options sold directly to consumers, which is SAD.
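A quick back-of-the-envelope on those die-size vs. performance ratios (the numbers are the ones quoted above from the Tom's Hardware review; this is just the arithmetic, nothing measured by me):

```python
# Perf-per-mm^2 of the GB206-based cards relative to the GB207-based RTX 5050,
# using the ratios quoted above: ~22% larger die, ~40% / ~26.5% faster at 1080p.
die_ratio = 1.22          # GB206 die area relative to GB207
perf_5060_ti_8gb = 1.40   # RTX 5060 Ti 8 GB performance relative to RTX 5050
perf_5060 = 1.265         # RTX 5060 (cut-down GB206) relative to RTX 5050

ppa_5060_ti = perf_5060_ti_8gb / die_ratio  # ~1.15x the 5050's perf per mm^2
ppa_5060 = perf_5060 / die_ratio            # ~1.04x, even as a cut-down die
```

So even the cut-down GB206 delivers more performance per unit of die area than the full GB207, which is the sense in which the GB207 design "just kinda sucks."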
-
Those things are probably more expensive than silver by weight now. Treasure them.
-
I thought you already owned it, since you kept mentioning it. I don't see why you'd need to upgrade from that for another 7+ years, unless it breaks or something. https://videocardz.com/newz/adata-and-msi-to-launch-the-worlds-first-4-rank-ddr5-cudimm-memory-128gb-per-module-on-z890-platform
-
I posted about it on the previous page. Many people are focusing on the 8 GB of VRAM, but 8 GB can and will be stretched to work. It's becoming ever more likely that AMD will officially release a version of FSR4 on RDNA3. Pricing for the Steam Machine will be fascinating.
-
This was with Zorin OS 17 (18 is out). As far as weird issues go, the hardware is probably a big factor. An ancient Intel CPU w/ iGPU like what I'm using is going to be rock solid in most cases. FSR Redstone kind of exists now: https://videocardz.com/newz/amd-launches-first-feature-from-fsr-redstone-with-call-of-duty-black-ops-7-only-for-rx-9000-gpus
-
Three games at once on Epic this week: https://store.epicgames.com/en-US/p/scourgebringer https://store.epicgames.com/en-US/p/songs-of-silence-778d86 https://store.epicgames.com/en-US/p/zero-hour-8449a0
-
I mostly figured out Lutris myself, just this past week. I installed a pirated old game using Zorin OS's built-in WINE support, ran it and got to the main menu, but it crashed when I tried to actually play. So I added the already-installed game to Lutris, selected Proton instead of WINE as the runner, and that worked.
-
The accidentally released FSR4 code uses INT8 to run on RDNA2/3, instead of the FP8 used by the version officially released for RDNA4. https://videocardz.com/newz/gamers-run-fsr-4-on-rdna2-gpus-better-image-quality-but-10-20-lower-performance
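For intuition on why an INT8 path behaves differently from an FP8 path: INT8 quantization typically uses one fixed step size for the whole tensor, while FP8 keeps roughly constant *relative* precision at every magnitude. A toy NumPy sketch (this is not FSR4's actual code; the FP8-style rounding below is a crude emulation that only rounds the significand and ignores FP8's limited exponent range):

```python
import numpy as np

def roundtrip_int8(x):
    # symmetric per-tensor INT8: one scale, uniform steps across the range
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127)
    return (q * scale).astype(np.float32)

def roundtrip_fp8ish(x):
    # crude FP8-style rounding: keep ~4 significand bits, exponent untouched
    m, e = np.frexp(x.astype(np.float32))   # x = m * 2**e, 0.5 <= |m| < 1
    return np.ldexp(np.round(m * 16.0) / 16.0, e).astype(np.float32)

x = np.logspace(-3, 3, 1001).astype(np.float32)  # values spanning 6 decades

rel_int8 = np.median(np.abs(x - roundtrip_int8(x)) / x)
rel_fp8 = np.median(np.abs(x - roundtrip_fp8ish(x)) / x)
# INT8's single step size flushes most of the small values to zero here,
# while the FP8-style path keeps relative error roughly constant throughout
```

With data spanning many orders of magnitude, the INT8 round-trip destroys the small values entirely, which is one plausible reason the INT8 fallback and FP8 original can't be expected to produce identical output.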
-
RDNA5 could double raytracing performance per CU (not sure if "path tracing" is being measured separately, but that will be improved too), and it will probably be the architecture used in the PS6 and the next Xbox in order to deliver usable raytracing performance. It's pretty likely that FSR4 will be ported to RDNA3. AMD already "leaked" it in a form that people were able to make work on RDNA2, but AMD is unlikely to offer official support for FSR4 on RDNA2. FSR "Redstone" should improve raytracing/path tracing. We might hear about that tomorrow during AMD's 2025 Financial Analyst Day, or at CES, or later.
-
They would rather sell the same silicon for $10,000+ than $1,000+. I don't blame them. You will not get the priority treatment you want. But they need a fallback if the bubble pops.

I will say that I'm surprised we haven't seen more persistent use of budget-node products. It's not like demand for allocation on older nodes completely evaporates when the new ones come online, so what sense does it make for AMD to stop making Navi 33 (RX 7600 XT through RX 7400), for example? Just keep making it for years, under the same names to avoid the Rebrandeon curse.

X-ray lithography or some kind of particle acceleration should be next, after iterative EUV upgrades. A company called Substrate recently announced it would take on ASML, but it's being called a scam.

I did see the chatter about RTX 50 Super delays/cancellation because of rising GDDR7 costs. I guess the memory market is more of a disaster than I anticipated.

You hit on a lot of TPU points. But AFAIK the top accelerators Nvidia and AMD make are purely AI-focused, not dual-use, with no display outputs. They do make workstation-class GPUs that are dual-use but generally focused on AI or whatever applications can exploit high VRAM (e.g. 48 GB). Consumer gaming GPUs are dual-use because AI accelerators and format support have been added to them, and now AI acceleration is actually used in gaming, mainly for upscaling.

While all these companies are competing with each other, there's only a finite number of wafers to go around. The best you can hope for is that someone improves TOPS/watt in such a revolutionary way that a surge in performance outpaces demand for that performance. But then Jevons paradox comes to mind. It's plausible that some new approach using optical/analog computing or something could leave the competition in the dust, but if it's still using the same nodes/wafers, it might not matter. It also doesn't solve memory demand problems unless they revolutionize memory usage with some weird new processing-in-memory, lower precision, or something.

I recommend the black pill of lowering resolution and settings, or playing older games. PS4-era and earlier games are obviously easy to run compared to the latest games, although some new games will scale well. iGPUs from a decade ago will play a lot (embrace the TDM 720p30 experience!). The world's most popular multiplayer games generally run well on old hardware, to ensure a large player base. Gaming on a budget is virtuous. You need to save up money for food and munitions. https://www.youtube.com/@BudgetBuildsOfficial/videos

The 7650 GRE is a lot slower than a 9060 XT 16 GB. Of course, if it's a pricing decision, you would know more about what makes sense. I don't follow these markets.

There was a 7500X3D leak recently. Budget X3D is cool, although probably unnecessary if the CPU is not a bottleneck. We are likely to see large increases in L3 cache (X3D and standard) during the Zen 6 and Zen 7 generations, both of which should be on AM5. So that's exciting. We'll be able to find out which games really benefit from over 96 MiB (Dwarf Fortress is a safe bet).
-
Every few years there's a DRAM pricing catastrophe, or price fixing. This one is particularly potent because the end of DDR4 production drove prices up right as more DDR4 PCs are entering the used market with Windows 10's end of life. Want 64 GB of DDR4 SODIMMs? It's no less than $200, probably more.

GPU concerns are probably overblown. While they are not as cheap as we'd want, a 9060 XT 16 GB or 5060 Ti 16 GB at/below MSRP isn't too bad and should handle the latest games. Recent iGPUs and much older dGPUs are sufficient for decades of older games, and of course The Dark Mod. Low-end, compact GPUs could use a kick in the rear, with the 3050 6 GB still being one of the better 75W/low-profile options. Maybe an Intel B380 will materialize, maybe not. AMD's RX 7400 is OEM-only for now, so not relevant yet. I am optimistic about low-end RDNA5, but it's a long wait.

@taffernicus What exactly are you looking for TPUs to bring to the table? Displacing gaming dGPUs from being used for AI by being cheap/efficient? The great thing about gaming GPUs is that they just work (at least on Nvidia's end) and are dual-use. Well before 6-8 years from now, we are likely to see the bubble pop if AI is not generating enough revenue.

Do not count out a next-generation Xbox... the rumor mill has it effectively pivoting to become an APU gaming PC with access to all storefronts (Steam, Epic, GOG, etc.) and Windows applications. So it could undercut DIY and OEM PCs, as well as the PS6 + game prices + misc. costs. It's not expected before 2027, so we should know if that's the strategy within about 2 years.