Posted: 16 Jan 2007 13:35
by mexicoshanty
It's such a shame. 256x256 tiles contain so much more detail than 128x128 or lower. Maybe an option would be to cache the sprites at 128x128 and then pull the 256x256 sprites out of the grf files only when needed, at the closest zoom level?
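A minimal sketch of that caching idea in C++, assuming a hypothetical LoadSpriteFromGrf() helper; OpenTTD's real sprite loader works differently, so names here are illustrative only:

#include <cstdint>
#include <map>
#include <vector>

struct Sprite {
    int width, height;
    std::vector<uint8_t> pixels; // decoded pixel data
};

// Hypothetical helper: decode one sprite from a .grf file at the given size.
Sprite LoadSpriteFromGrf(uint32_t id, int size) { return Sprite{size, size, {}}; }

class SpriteCache {
    std::map<uint32_t, Sprite> cache_128; // small versions stay resident

public:
    const Sprite &Get(uint32_t id, bool closest_zoom) {
        if (closest_zoom) {
            // Pull the full 256x256 version out of the grf on demand only
            // (held in one static slot here just to keep the sketch short).
            static Sprite big;
            big = LoadSpriteFromGrf(id, 256);
            return big;
        }
        auto it = cache_128.find(id);
        if (it == cache_128.end()) {
            it = cache_128.emplace(id, LoadSpriteFromGrf(id, 128)).first;
        }
        return it->second;
    }
};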
Posted: 16 Jan 2007 19:39
by DaleStan
That there poll is missing "Fix the economy, but don't touch the sprite sizes."
Posted: 16 Jan 2007 19:45
by Aracirion
DaleStan wrote:That there poll is missing "Fix the economy, but don't touch the sprite sizes."
This poll is more about size relations than the economy ... just vote for the old sizes in that case.
Posted: 17 Jan 2007 03:20
by Ben_Robbins_
Why are 10,000 sprites needed for the game all at once? Is it not possible to kill off the sprites that are not on screen or nearly on screen, since you only look at a fraction of the scenario at once?
I'm really not happy about the prospect of only making graphics at the original scale or double; everything is being made at 256. I'm aiming for that or nothing, I'm afraid. I'm not willing to waste another 14 months of my life chasing rainbows.
Posted: 17 Jan 2007 04:26
by DaleStan
Ben_Robbins_ wrote:Why are 10,000 sprites needed for the game all at once? Is it not possible to kill off the sprites that are not on screen or nearly on screen, since you only look at a fraction of the scenario at once?
It has nothing to do with keeping them all loaded into memory. You could load them from disk on demand, if you so desired, though this would be much slower. That's not the problem.
The problem stems from
1) The only way to know which sprites are required at any given time is to run all the sprite lookups. I think this should be relatively obvious, at least once you think about it. If it's not, please try to explain why you think this is not true.
2) Each sprite that the lookups can return must have a unique number, so that it can be uniquely identified.
Because of #1, #2 becomes "Each sprite that has been loaded must have ..." Prior to Peter's recent change, the sprite lookups could only return 16,384 unique values. So, even if you had a million sprites[0], you could only refer to the first sixteen thousand of them.
[0] Whether this is "loaded into memory" or "will load from disk on demand" is immaterial, provided that they are guaranteed not to disappear from the disk at some inopportune time.
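A tiny C++ illustration of that limit; the exact bit layout is an assumption for the sketch, what matters is that 16,384 = 2^14:

#include <cstdio>

int main() {
    const unsigned SPRITE_BITS = 14;                // bits available for the sprite number (assumed layout)
    const unsigned MAX_SPRITES = 1u << SPRITE_BITS; // 2^14 = 16384 unique IDs
    printf("addressable sprites: %u\n", MAX_SPRITES);
    // A sprite beyond this count simply cannot be referred to, no matter
    // how many are sitting on disk or loaded in memory.
    return 0;
}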
Posted: 17 Jan 2007 12:55
by brupje
Ben_Robbins_ wrote:
I'm really not happy about the prospect of only making graphics at the original scale or double; everything is being made at 256. I'm aiming for that or nothing, I'm afraid. I'm not willing to waste another 14 months of my life chasing rainbows.
I'm in favor of 256 as well, while still keeping the original size relations.
Posted: 17 Jan 2007 14:27
by Celestar
Well, it doesn't matter how big the sprites are when they're created; recoding 512x512 sprites down to 128x128 is simple. The other way round gets a tad more difficult.
Celestar
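A minimal sketch of why downscaling is the easy direction: halving a sprite is just averaging each 2x2 pixel block, applied twice to go from 512x512 to 128x128. Grayscale pixels are assumed for brevity; the helper name is illustrative:

#include <cstdint>
#include <vector>

// Box-filter one halving step: every 2x2 block becomes one output pixel.
std::vector<uint8_t> Halve(const std::vector<uint8_t> &src, int w, int h) {
    std::vector<uint8_t> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; y++) {
        for (int x = 0; x < w / 2; x++) {
            int sum = src[(2 * y) * w + (2 * x)]     + src[(2 * y) * w + (2 * x + 1)]
                    + src[(2 * y + 1) * w + (2 * x)] + src[(2 * y + 1) * w + (2 * x + 1)];
            dst[y * (w / 2) + x] = uint8_t(sum / 4); // average of the block
        }
    }
    return dst;
}

Going the other way means inventing detail that isn't in the source image, which is why upscaling is the hard direction.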
Posted: 17 Jan 2007 18:28
by Aracirion
So I think the only relevant question is what sprite size is technically feasible. Can someone say this with certainty?
Posted: 17 Jan 2007 19:37
by Celestar
No, not yet; that depends on how the system requirements will be adjusted, but IMHO 128x128 has a good chance of being the default, with something like 64x64 or 256x256 being optional if your computer has less/more bang under the hood. However, this will need some discussion on IRC.
Celestar
Posted: 17 Jan 2007 22:18
by Raven
Well, simutrans can handle big tile sizes, and it consumes more resources anyway due to passenger destinations and the like (prissi, where are you?).
So what prevents an interested tester from coding, say, 500x250-pixel objects, inserting them into simutrans, and seeing how it runs? I could do that, but I still have a bad taste in my mouth from the last time I suggested something in the Blender thread.
Regards
Posted: 18 Jan 2007 08:07
by brupje
Raven wrote:Well, simutrans can handle big tile sizes, and it consumes more resources anyway due to passenger destinations and the like (prissi, where are you?).
So what prevents an interested tester from coding, say, 500x250-pixel objects, inserting them into simutrans, and seeing how it runs? I could do that, but I still have a bad taste in my mouth from the last time I suggested something in the Blender thread.
Regards
Guess nothing would really prevent you from doing so. But how would that prove it would run nicely in OpenTTD as well?
I'm sorry Korenn's post offended you, but I don't see why it did.
Posted: 18 Jan 2007 08:20
by prissi
Since the images will be at most RLE or Huffman compressed in memory (otherwise they would be drawn really slowly), the required memory for 10,000 images is ~30 MB at 64, 122 MB at 128, 490 MB at 256, and 2 GB at 512 pixels, for 24 bits per pixel (or 2/3 of that for 16 bit, as used in simutrans).
The only way out would be a cache of, say, 1,000 images and rendering on demand (which simutrans does to some degree after zooming/darkening/different player colors). Just hope you never exceed that number while zoomed out.
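That arithmetic can be reproduced in a few lines of C++; the ~4:1 compression ratio is an assumption chosen to match the figures above:

#include <cstdio>

int main() {
    const double N = 10000, BYTES_PER_PIXEL = 3; // 24 bits per pixel
    const double COMPRESSION = 4.0;              // assumed RLE/Huffman ratio
    for (int size : {64, 128, 256, 512}) {
        double mb = N * size * size * BYTES_PER_PIXEL / COMPRESSION / 1e6;
        printf("%3dx%-3d: ~%.0f MB\n", size, size, mb);
    }
    // Prints ~31, ~123, ~492 and ~1966 MB, matching the 30 MB / 122 MB /
    // 490 MB / 2 GB figures quoted above.
    return 0;
}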
Posted: 18 Jan 2007 08:33
by brupje
prissi wrote:Since the images will be at most RLE or Huffman compressed in memory (otherwise they would be drawn really slowly), the required memory for 10,000 images is ~30 MB at 64, 122 MB at 128, 490 MB at 256, and 2 GB at 512 pixels, for 24 bits per pixel (or 2/3 of that for 16 bit, as used in simutrans).
The only way out would be a cache of, say, 1,000 images and rendering on demand (which simutrans does to some degree after zooming/darkening/different player colors). Just hope you never exceed that number while zoomed out.
When you zoom out, you won't be needing the 256px images.
Posted: 18 Jan 2007 09:13
by Celestar
Of course you need them; they'll just be zoomed. You cannot load and unload smaller and bigger sprites all the time the viewport is zoomed. In memory, there is just ONE size of sprite.
Celestar
Posted: 18 Jan 2007 09:46
by brupje
Celestar wrote:Of course you need them; they'll just be zoomed. You cannot load and unload smaller and bigger sprites all the time the viewport is zoomed. In memory, there is just ONE size of sprite.
Celestar
The way it is now, a few zoom levels are defined for each sprite. For example, if you have a sprite named house, you have house_z0, house_z1, ..., house_zn. So there is a separate sprite for each level.
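As a sketch, that scheme is just one pre-scaled bitmap per zoom level, picked at draw time; the names here are illustrative, not OpenTTD's actual structures:

#include <array>
#include <cstdint>
#include <vector>

constexpr int NUM_ZOOM_LEVELS = 4; // e.g. 1x, 1/2, 1/4, 1/8

struct ZoomedSprite {
    int width, height;
    std::vector<uint8_t> pixels;
};

struct Sprite {
    std::array<ZoomedSprite, NUM_ZOOM_LEVELS> levels; // house_z0 .. house_zn

    // No rescaling at draw time: just index the right pre-built version.
    const ZoomedSprite &ForZoom(int zoom) const { return levels[zoom]; }
};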
Posted: 18 Jan 2007 10:51
by doktorhonig
Some people will beat me for this, but with an OpenGL renderer this problem would probably be gone. Texture compression (which is lossy, but not that bad) would divide the memory consumption by 4, so the 256-pixel images could be kept in the texture memory of a 128 MB card, and the 128-pixel images in one quarter of that.
Rendering speed wouldn't be a problem anymore, because a few thousand polygons are actually nothing. We would just use the available GPU power and leave more CPU time for other tasks.
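A minimal sketch of that idea, assuming a driver that exposes GL_EXT_texture_compression_s3tc: hand the GL plain RGB data and ask it to compress on upload (DXT1 stores 4 bits per pixel, so the saving over 24-bit input is actually better than 4:1). The function name is illustrative:

#include <GL/gl.h>

#ifndef GL_COMPRESSED_RGB_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0
#endif

// Upload one 24-bit sprite, letting the driver compress it to DXT1.
GLuint UploadCompressedSprite(const unsigned char *rgb, int w, int h) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT,
                 w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, rgb);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}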
Posted: 18 Jan 2007 10:55
by brupje
doktorhonig wrote:Some people will beat me for this, but with an OpenGL renderer this problem would probably be gone. Texture compression (which is lossy, but not that bad) would divide the memory consumption by 4, so the 256-pixel images could be kept in the texture memory of a 128 MB card, and the 128-pixel images in one quarter of that.
Rendering speed wouldn't be a problem anymore, because a few thousand polygons are actually nothing. We would just use the available GPU power and leave more CPU time for other tasks.
What's keeping you from starting, then? ;p
Posted: 18 Jan 2007 11:11
by doktorhonig
ATM my diploma thesis, but you're right, I should probably try it on my own.
I just wanted to point out that having 490 MB of memory for sprites alone is actually more high-end than a 128 MB OpenGL accelerator.
Posted: 18 Jan 2007 11:40
by brupje
doktorhonig wrote:ATM my diploma thesis, but you're right, I should probably try it on my own.
I just wanted to point out that having 490 MB of memory for sprites alone is actually more high-end than a 128 MB OpenGL accelerator.
Well, I seriously doubt it's necessary to have 490 MB loaded.
I played with OpenGL rendering myself, but it failed to reach the performance/quality OTTD currently has. But then again, I am not an OpenGL programmer ;p
Posted: 18 Jan 2007 13:03
by Alltaken
Well, everyone should have at least 2 GB of RAM these days, so 490 MB should be nothing.
Alltaken