So I game on my laptop, which has a really bad CPU (Intel Core i5-4210 @ 1.70GHz, and my Device Manager shows four of them for some reason. I'm not good with processors, so if someone could explain that, it might add to the post's value, if there is any) and a passable GPU (AMD Radeon R7 M260; it's passable in that it runs everything I want to play on lowest settings).
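(Side note from my own poking around, so take it with a grain of salt: Device Manager lists *logical* processors, and dual-core mobile i5s of that generation have Hyper-Threading, so each physical core can show up as two entries. A quick Python sketch; the 2-threads-per-core figure is my assumption for this kind of chip, not something the OS reports directly.)

```python
import os

# os.cpu_count() reports *logical* processors, which is what
# Windows Device Manager lists. On a 2-core chip with
# Hyper-Threading, the OS sees 2 cores x 2 hardware threads
# = 4 logical processors.
logical = os.cpu_count()
print(f"logical processors: {logical}")

# Assumed value for a Hyper-Threaded chip; purely illustrative.
THREADS_PER_CORE = 2
print(f"implied physical cores: {logical // THREADS_PER_CORE}")
```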
Now, as you'd expect for a laptop below the game's system specs, it did not run very well (I didn't measure the fps, but I'd guess it was between 5 and 12). After updating my GPU's driver and turning every video option in game to minimum, it still wasn't playable, and I was about to give up hope when, on a hunch, I turned on my shader cache. I played it all day yesterday and seemed to be getting performance roughly equal to the footage I had seen of the game, albeit at minimum settings. A framerate check showed I was in fact playing at 17 fps on the character select screen, 12 fps in dungeons and on the ship, and about 52 fps in menus. On a whim I then turned the shader cache off, and the game immediately slowed to a crawl. Here's the curious part: the framerate was still the same (except in the menus), but the game chugged and there was suddenly a huge input delay. This went away when I turned the shader cache back on.
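If I understand it right, a shader cache just stores compiled shaders so the driver doesn't have to recompile them mid-frame. Here's a rough conceptual sketch of the idea, not a real graphics API (all the names here are made up for illustration):

```python
# Conceptual sketch of a shader cache: compile on a miss
# (expensive, stalls the frame), return the stored binary on a hit.

def compile_shader(source: str) -> str:
    """Stand-in for an expensive driver compile step.
    In a real driver this can take tens of milliseconds,
    which is where the chugging/input delay would come from."""
    return f"binary({source})"

class ShaderCache:
    def __init__(self):
        self._store = {}
        self.misses = 0

    def get(self, source: str) -> str:
        # Miss: pay the full compile cost this frame.
        # Hit: hand back the precompiled binary immediately.
        if source not in self._store:
            self.misses += 1
            self._store[source] = compile_shader(source)
        return self._store[source]

cache = ShaderCache()
first = cache.get("water_shader")   # miss: compile stall
second = cache.get("water_shader")  # hit: no stall
```

With the cache off, every shader gets recompiled whenever it's needed, which would explain stutter and input lag even when the average framerate counter reads the same.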
Does anyone have any idea why this happens? And why do I find 12 fps perfectly playable for this game? Am I just too much of a low-power pleb to notice framerate? I'm genuinely curious and would appreciate any input you guys can give.
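My rough guess on the playability question: maybe steady frame pacing matters more than the raw number. At a constant 12 fps each frame is on screen for about 83 ms, which you can apparently get used to, whereas stalls make individual frames spike way longer even if the fps counter reads the same on average. Quick sanity check of the arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    # Average time each frame is displayed, in milliseconds.
    return 1000.0 / fps

for fps in (12, 17, 52):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```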
UPDATE: Further research reveals my processor can reach a max clock speed of 2.4GHz, so it's less bad than I thought.