The Dark Mod Forums

Everything posted by MoroseTroll

  1. Kurshok: I would suggest you find an offer that bundles both games. I'm not sure, but I hope some e-stores still have pre-order keys for Dishonored 2 that include Dishonored 1 GOTY/Definitive Edition as a bonus.
  2. If we're talking about an acquisition of PowerVR, I'd imagine it would be by Intel. Those two have already cooperated once or twice (who remembers those slo-o-ow GMA 500/600/3600/3650 chips?), so maybe Intel would decide to reinforce its own GPU team with the PowerVR specialists. As for offloading FPU work to the iGPU: I think you guys would be better off finding a good Vulkan programmer (Axel Gneiting? Tiago Sousa?) instead of waiting for such a miracle.
  3. Every x86 extension (MMX, SSE, AVX, FMA, etc.) just extends the x86 ISA; it doesn't replace it. The "xor ax, ax" instruction still has the same byte encoding (0x33, 0xC0) that it had in 1978, when the 8086 was born. That's binary compatibility (a small sketch of the point is below). If AMD or Intel implemented, say, GVX (let's call it Graphics Vector eXtension), then every other CPU/GPU vendor would have to do the same in order to stay competitive. But AMD would never do that (remember 3DNow!, Enhanced 3DNow!, XOP, FMA4? Many modern AMD CPUs no longer support them, for very serious reasons), nVidia can't do it (it has no x86 CPU of its own), VIA is almost dead, so the only one left is Intel. Would Intel implement this hypothetical GVX? I don't know, but I doubt it. You see, Intel still hasn't brought AVX-512 to its desktop and laptop CPUs, even though that extension was announced in 2013, i.e. 4 (four!) years ago. So why would Intel bother with GVX?
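     A minimal sketch of that point, assuming GCC or Clang on an x86 machine (the feature strings passed to __builtin_cpu_supports are the compiler's, not mine): baseline x86 code runs on any x86 CPU, and newer extensions are only used after an explicit CPUID-based check, because they add instructions rather than replace the old ones.

     /* Query x86 extension support at runtime; the program itself is plain x86. */
     #include <stdio.h>

     int main(void)
     {
         __builtin_cpu_init();  /* populate the compiler's CPU feature flags via CPUID */
         printf("SSE2     : %d\n", __builtin_cpu_supports("sse2"));
         printf("AVX      : %d\n", __builtin_cpu_supports("avx"));
         printf("AVX-512F : %d\n", __builtin_cpu_supports("avx512f"));
         /* Whatever the answers are, this code keeps running unchanged -
            the extensions extend the ISA, they never replace it. */
         return 0;
     }

     Compile with plain gcc; no special flags are needed just to ask about the extensions.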
  4. As long as there is a bus between the CPU and GPU (even an integrated one), no major CPU vendor will put effort into offloading the FPU transparently through x86 code. Even the fastest buses have latencies of thousands of clock cycles, so it's a no-go. And even if there were no bus between the CPU and GPU (i.e. they were fully integrated on a single die), there would still have to be an ISA for programmers to target on those GPUs. Since every new GPU generation, even from a single vendor, has a different ISA, I can hardly imagine different GPU vendors standardizing such an ISA for decades to come.
  5. I suppose MS and the motherboard vendors just need some time to tune up Windows and their UEFIs, respectively. AFAIK, something is wrong with the Task Manager in Win10 when it runs on Ryzen, whereas everything is fine in Win7 (not sure about Win8/8.1). And yes, AMD has skimped on the SIMD unit in Ryzen, but only for 256-bit AVX; the SSE and 128-bit AVX modes both work very fast.

     About offloading the FPU to the iGPU: as long as the code uses FPU instructions (x87, MMX, SSE, AVX, FMA, etc.), this will never happen. Anyone who wants to offload work to the (i/d)GPU should use special languages and/or libraries: OpenCL, DirectCompute, Vulkan, Direct3D 12, GNM/GNMX with PSSL, etc. (a minimal sketch of what that looks like is below).

     About the Ryzen-based APUs: since they will contain no more than 4 cores (i.e. just one core complex (CCX), not two as in Ryzen 7), there will be no L3<->L3 traffic from tasks migrating between CCXs (Ryzen 7 has two of them). Plus, these APUs will boast a higher CPU frequency (at least 4 GHz in the high-end models, probably even higher), so I personally wouldn't say "Meh" about them.

     About evangelizing game developers: it's very hard to convince some of them to cooperate, so even Bethesda, with its id Tech 6, Void, and Creation engines, is a very good catch for AMD for now. Look at Ubisoft: they just hate everything except Intel and nVidia, and mind you, Ubisoft uses its own engines in almost all of its games. About Epic & Crytek: since Ryzen/Ryzen2/Ryzen3 and Vega/Navi will be the foundation of the next PlayStation and Xbox, I think they'll cooperate with AMD too - when the time comes.
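     To illustrate the "special languages and/or libraries" point, here is a minimal OpenCL sketch in plain C - my own toy example (the kernel name "scale", the assumption that a GPU device is present, and the stripped error handling are all mine), not code from any actual engine. The floating-point work is handed to the GPU explicitly through the OpenCL API and its kernel language; no x86 FPU/SSE instructions get "offloaded" behind the program's back.

     /* Scale an array of floats on the GPU via OpenCL (error handling omitted). */
     #include <stdio.h>
     #include <CL/cl.h>

     static const char *kSrc =
         "__kernel void scale(__global float *v, float k) {\n"
         "    size_t i = get_global_id(0);\n"
         "    v[i] = v[i] * k;\n"          /* the float math runs on the GPU */
         "}\n";

     int main(void)
     {
         enum { N = 1024 };
         float data[N];
         for (int i = 0; i < N; ++i) data[i] = (float)i;

         cl_platform_id plat; cl_device_id dev;
         clGetPlatformIDs(1, &plat, NULL);
         clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

         cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
         cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

         cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
         clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
         cl_kernel k = clCreateKernel(prog, "scale", NULL);

         cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                     sizeof data, data, NULL);
         float factor = 2.0f;
         clSetKernelArg(k, 0, sizeof buf, &buf);
         clSetKernelArg(k, 1, sizeof factor, &factor);

         size_t global = N;
         clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
         clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

         printf("data[10] = %f\n", data[10]);  /* expect 20.0 */

         clReleaseMemObject(buf); clReleaseKernel(k); clReleaseProgram(prog);
         clReleaseCommandQueue(q); clReleaseContext(ctx);
         return 0;
     }

     Link against the OpenCL ICD loader (e.g. -lOpenCL with gcc). The same pattern, with more boilerplate, applies to DirectCompute, Vulkan compute, or D3D12 compute shaders - the point is that the offload is explicit, not something the CPU does transparently.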
  6. MoroseTroll

    Vulkan

     Looks like Croteam is very serious about Vulkan.
  7. Some of Emily's abilities, like Domino and Mesmerize, can make your playthrough simpler and more fun. The story, AFAIK, is pretty much the same, apart from the comments on the environment.
  8. MoroseTroll

    Vulkan

     I wonder why Arkane is still using DX11 in its Void Engine when they could call in a few guys from id Software (say, Axel Gneiting and Tiago Sousa) to help implement Vulkan. Both companies are now part of Bethesda, so what's the problem? According to VGChartz and SteamSpy, Dishonored 1 sold almost as many copies on PC alone (3.75 million) as on all four consoles combined (3.76 million across 360 + PS3 + XBO + PS4), so I think both Bethesda and Arkane should understand how important the PC version of Dishonored 2 is, especially since this time there are only two console platforms (XBO + PS4).
  9. Guys, DH2 was already on sale at 33% off right before TGA 2016. Did you know that? About patch 1.3: with it, the framerate is much smoother and higher, so don't hesitate - buy the game, it's almost perfect.
  10. Nope, DH2 uses a completely different engine, the so-called Void Engine, an id Tech 5 fork, whereas DH1 uses Unreal Engine 3.5.
  11. I suspect that the in-game antialiasing option has the biggest negative impact on performance, so I'd suggest turning it off.
  12. Yesterday I spent more than 4 hours playing the intro, the queen's palace, Dunwall, and the first visit to the Void, and I'm happy (although I collected just 2/3 of the loot)! The game looks nice, but I should mention that I changed some default options to get a crisp image: in-game screen scaling (75% -> 100%) and antialiasing (TXAA low -> none). There are still a few small graphics artifacts, but I'm sure they will be eliminated soon. The gameplay is basically the same as in DH1, but if you pick the hardest difficulty (like I did), be ready: the game won't forgive any mistake you make. And I like it! There is also a lot of text, so bookworms like me will be happy too. In a nutshell, DH2 is a great game, from what I've seen and heard of it so far.
  13. Maybe that guy with the PS4 was talking about the preloaded part of the game?
  14. I'm sure the game will use every hardware thread your CPU has. Why? Because the Void Engine is based on id Tech 5, so we can fairly safely extrapolate the test results of the latest id Tech 5-based games (Wolfenstein: The New Order, Wolfenstein: The Old Blood, The Evil Within), say from GameGPU.com, to Dishonored 2. Those tests say: yes, there is a small difference between a Core i5 and a Core i7. How big will it be in Dishonored 2? Who knows, but I doubt it will be significant. About RAM: both Wolfensteins consume from 1.5 to 1.8 GB. How much RAM will Dishonored 2 consume? I don't know, but I doubt it will be more than 4-5 GB. And since id Tech 5 uses OpenGL, I suppose there will be no DX12 mode in Dishonored 2 - probably Vulkan (a.k.a. OpenGL Next) instead.
  15. Yes, I partially agree with you. I suppose Unreal 1 is an example of what you're describing - that game has fantastic, very varied visuals, but that was Epic's goal. Arkane's goal, I suppose, was and is a bit different where DH1 & DH2 are concerned. BTW, who knows, maybe DH2 contains some places with their own distinct visuals that we just haven't seen yet: say, the silver mines, or a garden with that Brigmore-ish tree from one of the videos, etc.
  16. It's called "a visual style," not "a lack of imagination." AFAIK, Dunwall's design was inspired by the Victorian architecture of British cities like London, Edinburgh, etc. And since almost all the events of DH1 take place in Dunwall and its outskirts, it's only natural that its visuals follow a consistent scheme. Although I've never been to Britain, I doubt anyone could find a Long Beach-style dawn there. The same goes for Karnaca: since almost all the events of DH2 take place in that city, it's only natural that its visuals follow a consistent scheme, too. You want more exotic visuals? Then welcome to the Far Cry, Crysis, Dead Island, and other series.
  17. MoroseTroll

    Vulkan

     From the point of view of a game developer or publisher, it doesn't matter whether the consumer actually plays the games (s)he bought or not.
  18. MoroseTroll

    Vulkan

     Also, it's well known that the average Mac user is considerably richer than the average PC user. If that rule holds for games as well, then those 3.34% of OS X users probably account for more than a 3.34% share of spending.
  19. MoroseTroll

    Vulkan

     Look at Steam games - a lot of them have Mac versions (8231, actually), which means the Mac is a relatively popular gaming platform. If so, it would be nice if this platform adopted an industry-standard API like Vulkan instead of a proprietary one like Metal. I think the fewer difficulties game developers face when porting their products to the Mac, the better.
  20. MoroseTroll

    Vulkan

    What about Doom 2016?
  21. Maybe. But I personally would prefer to call it not an evolution but a fork.
  22. Frankly, I doubt it. I'd imagine id's logic goes: "You guys had the balls to take and rework our engine, which means you surely have the balls to port it to Vulkan or whatever else you want." Like I said, these are different versions, Five and Six.
  23. I asked Raphael Colantonio via Twitter why Arkane chose id Tech 5 for DH2. He answered, "proprietary tech." I take this to mean that the Void Engine is based on id Tech 5, not 6. I'm sorry if I upset anyone who thought it would be 6. Does this mean DH2 will run on Windows PC via OpenGL? Likely, but I'm not 100% sure. Also, there is hope that DH2 will some day support either Vulkan (in case Arkane kept the OpenGL renderer) or DX12 (in case Arkane has already ported the OpenGL renderer to DX11).