The Dark Mod Forums

Search results for tag 'amd'.

  1. It's okay. It just made me wonder whether any of the alternate AA types could also handle sharpness in textures, so we can tackle geometry and texture aliasing in one go, seeing as we get both. Given the reasons mentioned for not using FXAA and the other mainstream options, I'm wondering if we may want to create our own AA post-processing shader for TDM's specific needs, since we need something attuned to both performance and various sharpness scenarios. AFAIK the Linux amdgpu driver offers no way to enable those universally: I believe some users said they can enable FXAA in their driver settings, but that's probably AMD Catalyst or Nvidia on Windows. Never tried ReShade but I've heard about it.
  2. Of course, it is one of the reasons for the decline of online forums since the advent of mobile phones. Forums on a mobile are a pain in the ass, but on the other hand, for certain things there is no real alternative to forums; social networks, with their sequential threads, can't replace them, since it is almost impossible to retrieve the answer to a question that was asked days ago. For devs, the only thing I can suggest for internal communication is a collaborative app such as System D (not to be confused with systemd): FOSS, free and anonymous registration, access for additional members by invitation only, fully encrypted and private. https://www.system-d.org
  3. Ignoring is somewhat inadequate, as you still see other members engaging in a discussion with the problematic user, and as Wellingtoncrab says, such discussions displace all other content within that channel. Moderation is also imperfect: being unpleasant to engage with is not in itself banworthy, so there is nothing more to be done if such people return to their old behaviour after a moderator has had a talk with them, except live with it or move away. I'd be more willing to deal with it if it felt like there were more on-topic discussion, i.e. thoughts about recently played fan missions or mappers showcasing their progress, rather than a stream of consciousness about a meta topic that may or may not have to do with TDM. I guess the forums already serve the desired purpose, or they just compartmentalise discussions better.
  4. Agreed on FXAA indeed, but it would be cool if you looked at SMAA (Enhanced Subpixel Morphological Antialiasing) as well. It's an upgraded version of MLAA (AMD's rival to FXAA), made by Crytek and the Universidad de Zaragoza. It's always subjective, but many say it looks better than both FXAA and MLAA while being almost as fast. I've never tried it in TDM, but I know that ReShade has an SMAA shader, which could be a way for you to test how it looks in-game compared to FXAA and later decide whether it's worth writing an in-house one (see the edge-detection sketch after this list). And IMO there's no need to replace multisampling; it can work alongside post-process AA systems very well.
  5. Not only that, but IMO today, with Nvidia's AI frame generation (or any equivalent tech that AMD may eventually come up with), frame rate numbers have lost all their meaning (at least with frame generation turned on). With DLSS 3 FG you can have a "60" fps game that feels like a 30 fps game or worse, in terms of input lag, if that is the base frame rate the DLSS AI is working from. High frame rate numbers are a good way to sell GPUs...
  6. I don't recall a system for noise masking. It sounds like it'd be a good idea, but when you get into the details you realize it'd be complicated to implement. It's not only noise that goes into it, I think. E.g., a high register can cut through even a loud but low-register rumble. And it's not like the .wav file even has data on the register of what it's playing. So either you have to add metadata (which is insane), or you have to have a system that literally checks pitch on the .wav data and parameterizes it in time, to know when it's going to cut through what other parameters from other sounds. For that matter, it doesn't even have the data on the loudness either, so you'd have to get that off the file too and time the peaks with the "simultaneous" moment at arbitrary places in every other sound file correctly. And then position is going to matter independently for each AI, so it's not like you can have one computation that works the same for all AI. You'd have to compute the masking level for each one, and then you get into the expense you're mentioning (see the sketch after this list). I know there was a long discussion about it in the internal forums, and probably on the public subforums too, but it's been so long ago now that I can't even remember the gist of it. Anyway, the main issue is I don't know if you'll find a champion who wants to work on it. But if you're really curious to see how it might work, you could always try your hand at coding & implementing it. Nothing beats a good demo to test an idea in action. And there's no better way to learn how to code than a little project like that. I always encourage people to try to implement an idea they have, whether or not it may be a good idea, just because it shows the power of an open source game. We fans can try anything we want and see if it works!
  7. kano

    2016+ CPU/GPU News

    Not hardware related, but I upgraded to Debian 12 and I swear the system feels faster and more responsive now. Quite a bit, actually. I know for sure that the AMD drivers were improved between Linux 5.10 and 6.1, but it really feels like other optimizations and improvements were made as well. And it's not a fresh install, it's an upgrade, so there's none of that "you started with a clean slate, so of course it's faster" that you get when you first install Windows and the registry hasn't gotten filled with crap yet. It's just too bad that Linux 6.2 did not make it into Debian 12 as standard, because I think that's what you need for good Intel Arc support. I was this close to buying an A770 last week, but then the price went up overnight from $329 to $400. I guess Intel saw the announcements just like we did. But I think I'm going to just sit on my current graphics hardware for as long as possible to teach the industry a lesson. EDIT: to be clear, the desktop is faster and snappier not just with AMD graphics but with NVidia as well, and the web browser too. They must have done something to improve scheduling in the kernel. Now that more consumer devices, e.g. the Steam Deck, are running Linux, and not just servers, one should probably expect more improvements of this nature.
  8. jaxa

    2016+ CPU/GPU News

    https://videocardz.com/newz/amd-ryzen-5-5600x3d-6-core-cpu-with-3d-v-cache-is-reportedly-coming Rumored 5600X3D coming to AM4.
  9. I'm using the version from kcghost. I just tested it and I can't see any difference inside the inventory. The stats screen itself doesn't show the different loot types (those are still seen in the inventory), but instead gives more info on the stealth score. Edit: I see Dragofer made an updated version of his script. I have to check that out. Edit 2: That version works: https://forums.thedarkmod.com/applications/core/interface/file/attachment.php?id=21272&key=02755164a3bed10498683771fe9a0453
  10. I looked but didn't see this video posted in these forums. It's pretty cool.
  11. It wasn't a "sacrifice", it was a deliberate decision. People wanted the game to be as close as possible to the original, including pixelated graphics. If you ask me, the former version based on the Unity engine looked and felt better. But, hey... I guess I'm not the right person to judge that, as I never played the original, and always found that the art style of System Shock 2 is much better anyway. This also illustrates the issue with community funded games: Too many cooks spoil the broth. In game design, you need freedom, not thousands of people who want you to do this and this and that. Just take a look at the Steam forums and see how all those wimps complain again about everything. Hopeless.
  12. So giving it none of those tags, but making the AI invisible, silent, non-solid, and on a team neutral to everyone would not work? Oh well, it was a horribly inelegant idea anyway.
  13. What I understood is that the idea of TDM was born from the fact that it was unclear at the time whether T3 would get a level editor. Source: https://web.archive.org/web/20050218173856/http://evilavatar.com/forums/showthread.php?t=268
  14. This one is really essential: https://www.ttlg.com/forums/showthread.php?t=138607 Should work fine with the GOG version.
  15. https://www.ttlg.com/forums/showthread.php?t=152224 There is a new mapping contest over on TTLG for the Thief: Deadly Shadows 20th Anniversary, and the organizers were kind enough to include The Dark Mod along with all of the Thief games as an option for making a mission to submit as an entry. The deadline is a year from yesterday and the rules are pretty open. I recommend going to the original thread for the details, but I will summarize here:
    Rules:
    - The mission(s) can be for Thief 1, Thief 2, Deadly Shadows or The Dark Mod.
    - Collaborations are allowed.
    - Contestants can use any custom resource they want, though TDM missions cannot use the Deadly Shadows resource pack.
    - Contestants can submit more than one mission.
    - Contestants can enter anonymously.
    - The mission(s) can be of any size. Using prefabs is allowed, but the idea is that this is a new mission; starting from an abandoned map or importing large areas from other maps is not allowed. Naturally this is on the honor system, as we have no way of validating it.
    Mission themes and contents: There is no requirement from a theme or story viewpoint; however, contestants might consider that many players may expect or prefer missions that are celebratory of Thief: Deadly Shadows in this respect: castles, manors, museums, ruins inhabited by Pagans and the like, with a balance of magic versus technology. This is entirely up to the authors to follow or not - it is just mentioned here as an FYI and, while individual voters may of course choose to vote higher or lower based on this on their own, it will not be a criterion used explicitly in voting or scoring.
    Deadline: May 25th, 2024 at 23:59 Pacific Time. See the TTLG thread for details on submissions and the voting process. Provided I can make the deadline, I hope to participate. It would be nice to see the entire community do something together, and expressing our complicated relationship with this divisive game seems as good a pretext as any.
  16. jaxa

    2016+ CPU/GPU News

    I don't know what will happen to the RTX 4060 and 4060 Ti, but I expect the RX 7600 price to slide on down. AMD already started the slide ~48 hours pre-release, so it's not like they're confident in the product. I'd consider picking up a 16 GB 7600 XT as a replacement for my GTX 970, but I don't really need it for TDM.
  17. jaxa

    2016+ CPU/GPU News

    AMD always knows how to fumble an opportunity. But expecting them to launch a 16 GB version at $300 is just unrealistic. That would probably cannibalize sales of the conspicuously absent 7700/7800 "mid-range" cards at that price. And if it does exist, maybe it's not ready, much like the 4060 Ti 16 GB is launching months later. By waiting, they can sell off more RDNA2 cards, so that those are out of the mix by the time they launch the rest of the RDNA3 lineup. As we usually see, prices will migrate to below MSRP after a while. So the $270 RX 7600 could become $220 or whatever.
  18. kano

    2016+ CPU/GPU News

    AMD isn't sandbagging the market and trolling customers as hard as NVidia, but if they were smart, they would have just kept the higher price of $300 and included 16 GB of memory. This would have been an easy and epic KO, perhaps along with an advertising campaign about being for gamers who don't want smeared, low-resolution textures once all game development for PS4/Xbox One stops next year and developers have the freedom to crank quality up some more.
  19. jaxa

    2016+ CPU/GPU News

    Game developers want 16-32 GB VRAM to work with now. They can't always count on these other features, especially on PCs. Having a large amount of VRAM is the easiest solution for reducing latency. Let me counter your DirectStorage idea with another idea: put an SSD inside consumer GPUs, i.e. the "SSG" concept that AMD launched to the professional market a few years ago. Putting a tiny 512 GB SSD inside graphics cards would be relatively cheap, hold the entirety of any video game, and the GPU could still work if it breaks. Just cache the entire game you're playing into the GPU (or everything the GPU wants). Everyone will ultimately want a slab of APU that contains 3D memory-storage, with the CPU, GPU, RAM, accelerators, etc. all together to limit latency. https://wccftech.com/amd-radeon-rx-7600-269-usd-launch-navi-33-8-gb-tackles-nvidia-rtx-4060-1080p/ $270 is on the low end of RX 7600 price estimates (some people wanted $250 obviously). That's not bad for something that is presumably faster than an RX 6650 XT. There will be some complaints about price/performance not moving forward, but prices can drop below MSRP at some point. At this moment though, something like this 6750 XT for $320 soundly defeats the RX 7600, while supplies last. The RX 6700 10 GB is also a good option. If AMD later tosses a RX 7600 XT 16 GB onto the market for $350, then they won the "low-end GPU market" this generation. Edit: AMD did a last minute price drop on the RX 7600, $300 to $270. That 10% is enough to change it from kinda bad to kinda OK.
  20. I just don't get it. While having more VRAM on the GPU is an easy way to improve performance and avoid resource issues, GPU architects have known for ages that the better solution is some sort of hierarchy where you page out to system RAM once you are low on GPU RAM (see the paging sketch after this list). The AGP aperture was designed around this idea. When PCIe came out, it was meant to have a flexible memory aperture natively, but somehow IHVs never optimized for that. Then Nvidia created PCIe "TurboCache" and AMD created HyperMemory. Issue solved? Apparently not. id Software pioneered the use of virtual textures in id Tech 5, and that became a standard graphics feature called "Partially Resident Textures". How in the world is a graphics shader hack more effective than hardware solutions like TurboCache? Anyway, it works, but it seems cumbersome to implement, so it hasn't caught on, I guess. Then the AMD Vega architecture launched and AMD began touting how great its cache-hierarchy system worked, as if the same idea were not already in play in all the previous examples above. Also, when Vega launched, they made a branded version of "Resizable BAR", which is conceptually the same idea as PCIe's native flexible aperture; apparently, even though a flexible specification existed, somewhere in the bowels of the graphics hardware there were still hard-coded limits that needed to be fixed. Then the new consoles pushed the DirectStorage idea, which also has a graphics memory hierarchy design. We have all these tools to use system RAM and even SSD/NVMe for video memory, yet we are still designing game engines to front-load all data into VRAM and acting like it's "impossible" to have good performance when dedicated video RAM is below X amount. This should be a solved problem by now.
  21. jaxa

    2016+ CPU/GPU News

    "There are no bad products, only bad prices." Which are not known for RX 7600 (XT) yet. Though it doesn't sound like it will be a fantastic MSRP going by the rumors. GDDR6 costs have fallen from pandemic/shortage highs, but they are different from DRAM and other types. We're all still waiting for cheap HBM, and will be waiting forever potentially. 8 GB of VRAM is going to be enough for many 1080p gamers. Remember that 4-6 GB cards (AMD 6500 XT and Arc A380) are still in the market. Which reminds me, leaks have pointed towards a possible 7500 XT based on the same Navi 33 die instead of a Navi 34, so maybe 8 GB can reach an even lower price point. It's possible that both AMD and Nvidia have plans to leave the low-end GPU market high and dry forever, but for different reasons. AMD is making powerful APUs that will be enough for 1080p gaming if/when they come to the AM5 desktop socket. If Rembrandt and Phoenix aren't enough performance for consistent 1080p60, Strix Point should be. Many people will be able to forgo buying any kind of discrete graphics card in the near future. Nvidia on the other hand is more of an AI company now. Gamers don't bring in the big bucks anymore, although they can be milked effectively with flagships like the RTX 4090 and "mid-tier" cards busting past $700. Based on the waste products of the professional market, of course.
  22. Considering the beastly performance of today's cards, it had to drop at some point, not to mention getting a case where that 4090 Ti will fit; I'd also need a nuclear power plant to drive it, hehe. It is quite funny: AMD was always known in the past for power-hungry toasters when it came to graphics cards, but no AMD card today holds a candle to the 4090 in power draw, not even the R9 295X2, which had a rather blow-your-fuses attitude. The 4090 can go up to 800 watts of power draw!!! That is insane, and it just shows how powerful a card has to be to drive an 8K monitor. But here is something rather scary: the AMD RX 7900 XT can match it if you remove the restrictions on power draw (a guy tried that out recently) with a hacked driver. The card needed a triple-fan liquid cooling system, but that was all, and it stayed at 45°C while trading blows with the 4090 Ti. Holy sh..
  23. kano

    2016+ CPU/GPU News

    https://wccftech.com/amd-radeon-rx-7600-8-gb-graphics-card-specs-leak-6nm-navi-33-xl-gpu-2048-cores-8-gb-vram/ So yeah, NVidia and AMD have both decided to completely stall progress in the mainstream GPU market, similar to the state the CPU market was in a decade ago when Bulldozer came out. I guess I won't need to get a new graphics card for a long time. Finally I remind you that memory is cheaper right now than it has probably been in a decade. Hopefully the gaming community is astute enough to collectively say "no" to these products.
  24. jaxa

    2016+ CPU/GPU News

    >Processor: AMD Ryzen 7 2700X Eight-Core Processor (16 CPUs), ~3.7GHz; Intel i5 7000 series Intel i5 7000 series = 4-core, 4-thread Kaby Lake CPUs. So no, your 5600X or whatever will be just fine. These companies can write whatever they want as minimum/recommended specs and it doesn't have to make any sense. But you've also misinterpreted it.
  25. jaxa

    2016+ CPU/GPU News

    I use a 4* GB card. It's enough for TDM as far as I can tell. At least we can be pretty sure now that AMD will make a 16 GB variant of the 7600 XT. $350? * https://www.pcworld.com/article/415858/nvidia-agrees-to-geforce-gtx-970-false-advertising-settlement-offers-30-refunds.html
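
A note on the FXAA/SMAA discussion in items 1 and 4: both techniques begin with a luma-based edge-detection pass, and that pass is where the sharpness-versus-performance trade-off the posters mention lives. Below is a minimal CPU-side sketch of that first step, not TDM's or ReShade's actual shader code; the luma weights, the 0.1 threshold and the test image are illustrative assumptions only.

```python
# Minimal sketch of the luma edge-detection pass that FXAA/SMAA-style
# post-process AA starts from. Pure Python over a tiny RGB image;
# the threshold and the image data are illustrative only.

def luma(rgb):
    """Perceptual luminance approximation (Rec. 709 weights)."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def edge_mask(image, threshold=0.1):
    """Return a 2D mask marking pixels whose luma differs from the
    left or top neighbour by more than `threshold` (edges to blend)."""
    h, w = len(image), len(image[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            l = luma(image[y][x])
            left = luma(image[y][x - 1]) if x > 0 else l
            top = luma(image[y - 1][x]) if y > 0 else l
            # Only pixels with a strong horizontal or vertical luma
            # discontinuity get blended later, which is why texture
            # interiors keep their sharpness.
            mask[y][x] = abs(l - left) > threshold or abs(l - top) > threshold
    return mask

if __name__ == "__main__":
    # 4x4 test image: a dark left half against a bright right half.
    dark, bright = (0.1, 0.1, 0.1), (0.9, 0.9, 0.9)
    img = [[dark, dark, bright, bright] for _ in range(4)]
    for row in edge_mask(img):
        print("".join("#" if e else "." for e in row))
```

SMAA's advantage over plain FXAA lies mostly in what happens after this detection step (edge-pattern classification and better blending weights), which is why it tends to preserve sharp features at similar cost.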
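
On the noise-masking question in item 6, the per-AI cost the poster describes is easy to see in a toy model. The sketch below is purely illustrative and none of these names exist in TDM's code; it assumes each sound has been annotated (or analysed offline) with a rough loudness and dominant frequency band, and then estimates for one listener whether a player-made sound is drowned out by louder ambient sounds in a similar register.

```python
# Toy model of per-AI sound masking. All names, numbers and the
# masking rule are hypothetical; TDM has no such system today.
import math
from dataclasses import dataclass

@dataclass
class Sound:
    loudness_db: float   # loudness at the source
    band_hz: float       # rough dominant frequency ("register")
    position: tuple      # world position (x, y, z)

def attenuated_db(sound, listener_pos, falloff_db_per_doubling=6.0):
    """Loudness at the listener, dropping ~6 dB per doubling of distance."""
    dist = max(1.0, math.dist(sound.position, listener_pos))
    return sound.loudness_db - falloff_db_per_doubling * math.log2(dist)

def is_masked(event, ambient_sounds, listener_pos,
              band_ratio=2.0, margin_db=3.0):
    """An ambient sound masks the event only if it is both louder at
    the listener (by `margin_db`) AND within roughly the same register
    (within a factor of `band_ratio` in frequency)."""
    event_db = attenuated_db(event, listener_pos)
    for amb in ambient_sounds:
        same_register = (1.0 / band_ratio
                         <= amb.band_hz / event.band_hz
                         <= band_ratio)
        louder = attenuated_db(amb, listener_pos) >= event_db + margin_db
        if same_register and louder:
            return True
    return False

if __name__ == "__main__":
    footstep = Sound(40.0, 500.0, (0.0, 0.0, 0.0))   # player footstep
    machinery = Sound(70.0, 60.0, (3.0, 0.0, 0.0))   # loud low-register rumble
    crowd = Sound(65.0, 600.0, (4.0, 0.0, 0.0))      # mid-register chatter
    guard_pos = (2.0, 0.0, 0.0)
    # The rumble is louder but in a different register, so it does not
    # mask the footstep on its own; the crowd does.
    print(is_masked(footstep, [machinery], guard_pos))         # False
    print(is_masked(footstep, [machinery, crowd], guard_pos))  # True
```

Even in this toy form the poster's point shows through: the test has to be evaluated per AI (the distances differ for each listener) and per sound pair, and it only works at all if loudness and register are available as data rather than buried in the .wav samples.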
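
For the VRAM-hierarchy discussion in item 20: virtual texturing / partially resident textures amount to treating VRAM as a fixed-size cache of texture tiles and paging tiles in from system RAM or storage on demand. Here is a minimal sketch of that residency bookkeeping under assumed numbers; the tile size, budget and backing-store stub are made up, and real implementations do this on the GPU with feedback buffers rather than in Python.

```python
# Minimal sketch of virtual-texture residency: VRAM as an LRU cache
# of fixed-size tiles, with misses paged in from "system memory".
# Tile size, budget and the backing-store stub are all illustrative.
from collections import OrderedDict

TILE_BYTES = 128 * 128 * 4          # one 128x128 RGBA8 tile (example)

class TileCache:
    def __init__(self, vram_budget_bytes):
        self.capacity = vram_budget_bytes // TILE_BYTES
        self.resident = OrderedDict()   # tile_id -> tile data, LRU order
        self.misses = 0

    def _load_from_backing_store(self, tile_id):
        # Stand-in for the slow path out of system RAM or SSD, the path
        # that TurboCache/HyperMemory/DirectStorage try to make cheap.
        self.misses += 1
        return bytearray(TILE_BYTES)

    def fetch(self, tile_id):
        """Return a resident tile, evicting the least recently used
        tile when the VRAM budget is exceeded."""
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)      # mark as recently used
        else:
            if len(self.resident) >= self.capacity:
                self.resident.popitem(last=False)   # evict the LRU tile
            self.resident[tile_id] = self._load_from_backing_store(tile_id)
        return self.resident[tile_id]

if __name__ == "__main__":
    cache = TileCache(vram_budget_bytes=4 * TILE_BYTES)  # room for 4 tiles
    for tile in [0, 1, 2, 3, 0, 1, 4, 0]:   # fill, reuse, then overflow
        cache.fetch(tile)
    print("misses:", cache.misses, "resident:", list(cache.resident))
```

The front-loading the poster complains about is the degenerate case where the budget is assumed large enough that the eviction path never runs; the hardware features mentioned in that post exist to make the miss path cheap enough that engines can stop making that assumption.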