Longer maps vs bigger maps
Moderator: OpenTTD Developers
It seems that in the near future we'll have bigger maps in the official version, since Korenn has already done it for his own build.
I'd like to know what other players think about big maps. Of course a lot of us like them, but do you prefer bigger square maps (1024x1024, 512x512) or, like me, rectangular maps (1024x512, 512x256)?
I don't know if it's possible to make rectangular maps in an easy way, I just hope so.
dominik81 wrote: Bigger maps in the official OpenTTD tree will most likely be dynamic, so you could choose any size you like. Don't expect bigger maps too soon though.
Good news, because between a huge map and a micro-map, we should prefer a medium, personalised size.

I hope it will not take too long to implement in the official project.

- Born Acorn
- Tycoon
- Posts: 7596
- Joined: 10 Dec 2002 20:36
- Skype: bornacorn
- Location: Wrexham, Wales
- Contact:
I think they should be dynamic. I don't think it would be so hard to program; once the support for bigger maps is finished, it will just be the adjustment of a parameter, I guess.
This way everybody would be able to make their own size.
However, there should be a limitation, i.e. a maximum size.
gr. Bas


However, there should be a limitation, i.e. a maximum size.
gr. Bas
I'd think that processing time and memory requirements for a map of size m*n are O(m*n), so doubling one dimension doubles those requirements. A map of 1024*1024 is 16 times larger than the current 256*256, and would probably be more than enough for most mapmakers. If there were no limit and people suddenly tried creating things like 4096*4096 maps, there'd just be no need for it, unless you're modelling reality to the point where a 3-hour train journey takes 3 hours of real time in the game.
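The O(m*n) argument above can be checked with a trivial sketch (purely illustrative, not OpenTTD code): if cost is proportional to the tile count, the relative load of each proposed size follows directly.

```python
# Illustration of the O(m*n) scaling argument: if memory and
# per-tick processing grow with the tile count m*n, each map
# size's cost relative to the current 256x256 is just a ratio.
def tiles(m, n):
    return m * n

base = tiles(256, 256)                # current map: 65,536 tiles
print(tiles(1024, 1024) // base)      # 16  (16x the work)
print(tiles(4096, 4096) // base)      # 256 (why a hard cap makes sense)
```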
Bugzilla available for use - PM for details.
You're all underestimating the complexity of OTTD. Computers got better since 1995, but not as much as some of you might think. A regular game, in the early years, uses around 6% of my CPU power (Athlon 2100+, Radeon 9500 Pro, 1 GB RAM). Multiply that by 16 (1024x1024 map) and you start to run into problems. There are other drawbacks too: Korenn's bigger maps, for example, took much more time to update the tiles. Another option would be to limit vehicles or decrease the recursion level for the AI.
Besides, from a developer's point of view a game is supposed to be stable. And without any limits to the map size you'd certainly be able to crash it very easily.
dominik81 wrote: Computers got better since 1995, but not as much as some of you might think.
Actually, they have - it's just that software is bloating so much that it's not that noticeable.

People are forgetting that doubling the map's dimensions quadruples, not doubles, the processing time, though.

Our processors are easily 16+ times as fast as in 1995. Windows 95 had just come out, and many of us were still running 486s - we're talking 33-100 MHz here. My 2.5 GHz computer is 25x faster in clock speed alone, and that's not even accounting for the architectural differences between the 486 and early Pentiums compared to a modern Pentium! My computer is easily 50+ times more powerful than a 100 MHz 486.
That means I could handle a 1792x1792 map as easily as a 100 MHz 486 could handle a 256x256 map.
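That 1792 figure can be reproduced with back-of-the-envelope arithmetic (a sketch under the post's own assumptions: a ~50x speedup and cost proportional to m*n, so the side length scales with the square root of the speedup):

```python
import math

# If the machine is ~50x faster and cost scales with m*n, the
# largest square map it runs "as easily" as a 486 runs 256x256
# has a side of 256 * sqrt(50), rounded down to a multiple of 256.
speedup = 50
side = 256 * math.sqrt(speedup)
print(int(side))                 # 1810 tiles per side, unrounded
print(int(side) // 256 * 256)    # 1792, the figure quoted above
```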
But we don't all have the latest CPUs.
Nobody says you have to use the larger map sizes!
dominik81 wrote: And without any limits to the map size you'd certainly be able to crash it very easily.
Crashing == memory leaks, not simply taking more memory. If you have Windows NT/2000/XP, it'll slow down, maybe even refuse to run OpenTTD, but I doubt it'll crash.
2048x2048 would probably be a reasonable limit for now. It would push today's fastest personal computers to their limits, but it wouldn't kill them.
Last edited by CobraA1 on 29 Apr 2004 17:28, edited 1 time in total.
"If a man does not keep pace with his companions, perhaps it is because he hears a different drummer. Let him step to the music he hears, however measured or far away" --Henry David Thoreau
An unlimited map size would be impossible - our computers are finite - and you'd easily hit physical limits such as the size of an integer if you tried. No, there must be limits.
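The integer-size point is easy to make concrete. As a purely hypothetical illustration (an assumed representation, not OpenTTD's actual internals): if every tile is addressed by a single unsigned 16-bit index, one 256x256 map already uses the entire address range.

```python
# Assumed tile-index widths, for illustration only.
MAX_U16 = 2**16
print(MAX_U16, 256 * 256)        # 65536 65536 - a 16-bit index is
                                 # exactly exhausted by one 256x256 map
MAX_U32 = 2**32
print(MAX_U32 == 65536 * 65536)  # True - a 32-bit index caps out at
                                 # 65536x65536; some finite limit remains
```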

"If a man does not keep pace with his companions, perhaps it is because he hears a different drummer. Let him step to the music he hears, however measured or far away" --Henry David Thoreau
dominik81 wrote: Another option would be to limit vehicles or decrease the recursion level for the AI.
You mean make it even more stupid than in the original?

As for size limits, we need to look at need as well as ability. It's one thing to have a map of 2048x2048 tiles, but would anyone creating scenarios have time to fill that reasonably? Do you really need maps that big? Can you get your services running in such a large area before getting to the point of making billions per day?
dominik81 wrote: Another option would be to limit vehicles or decrease the recursion level for the AI.
OMG, please don't tell me we're sticking with this AI!!
You mean make it even more stupid than in the original?
No way, drop the current AI and put something better in! I can think of at least a dozen ways to make networks that are better than the current algorithm!
"If a man does not keep pace with his companions, perhaps it is because he hears a different drummer. Let him step to the music he hears, however measured or far away" --Henry David Thoreau