Please try again. All URLs are being redirected to https since yesterday, which doesn’t seem to work well with the “remote control” feature. So for now the remote control link has been switched back to use http only.
Maybe someone should do an experiment to find out …
I guess it was a simple misunderstanding, but it’s not true that you aren’t allowed to use the Romanian language just because someone else already did. Nobody here has a monopoly on using any real-world language.
Just imagine we had a rule like this for English. Probably half of all users would be in violation of it.
Oops, forgot to re-enable rendering after doing some maintenance work. Everything should appear soon.
Should be fixed now; was probably a remnant of the earlier flooding. OGF’s tile expiry mechanism isn’t particularly well suited for the case of flooding, which causes tiles that have been rendered in a flooded state to remain that way after the flooding itself has been fixed.
If there are still flooded tiles (after clearing your browser cache), please let me know.
As wangi and martinum4 said, the relation must be tagged instead of the outer ways. I just want to add that the disappearance has indeed been triggered by an update of the standard layer:
Yes, it does.
@niels20020: some sort of HistoricalMap display can probably be achieved with a Leaflet application that uses Overpass queries, similar to what we recently did with route relations.
I wrote a comment on that page, and then deleted it with a different (non-admin) user, both without problems. So the problem seems to be related to your user account rather than to that specific page.
Now I’ve enabled debug logging for the wiki. Can you please save the page once again, preferably after logging off and clearing the browser cache?
No, I wasn’t aware. I assume this is the affected page:
Has it happened more than once? And what was the text of the error message? Generally, there doesn’t seem to be a problem with page creation (just created a new page without problem, and it does show up). Please let me know if it happens again.
No, I don’t actually think that interpolation is that complicated. I just wanted to point out that it’s unrealistic to assume constant speed over a complete route. (And that the information about staying time at the stops was missing.)
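The interpolation itself really is simple; here’s a minimal sketch (all function and parameter names are made up for illustration) of the constant-speed position estimate between two stops:

```python
def position_between_stops(dep_time, arr_time, now, dep_km, arr_km):
    """Linearly interpolate the distance (km along the route) at time
    'now', assuming constant speed between two stops.  Times are in
    seconds; dep_km/arr_km are the cumulative distances of the stops."""
    if now <= dep_time:
        return dep_km          # still waiting at the first stop
    if now >= arr_time:
        return arr_km          # already arrived at the second stop
    fraction = (now - dep_time) / (arr_time - dep_time)
    return dep_km + fraction * (arr_km - dep_km)
```

The unrealistic part is exactly the constant-speed assumption baked into the single `fraction` term, plus the fact that dwell times at the stops have to be known for `dep_time`/`arr_time` to be meaningful.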
Doing complicated calculations, that’s actually what computers are for (and the fact that you used a spreadsheet is proof of that). IMO the preferred method would be the “trip_stop_intervals.txt” approach. That might be still somewhat unrealistic, because the time needed for a trip might vary over a day due to traffic conditions, but I’m OK with ignoring that. Maybe for every route there could be two or three variants.
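The “trip_stop_intervals.txt” idea could be sketched roughly like this, assuming each leg is stored as a (travel time, dwell time) pair and absolute times are derived on demand; the data shapes here are illustrative, not part of any spec:

```python
def expand_intervals(start, legs):
    """Turn a trip start time (seconds since midnight) and a list of
    (travel_seconds, dwell_seconds) legs into (arrival, departure)
    pairs for each subsequent stop.  Only the intervals are stored;
    absolute times are computed when needed."""
    times = []
    t = start
    for travel, dwell in legs:
        arrival = t + travel
        departure = arrival + dwell
        times.append((arrival, departure))
        t = departure
    return times
```

Two or three variants per route (e.g. peak vs. off-peak) would then just be different `legs` lists reused for many start times.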
The most complicated part would probably be to have some kind of “collision detection” to ensure that multiple trains don’t occupy the same station or travel in the same location at the same time.
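A naive version of such a collision check could simply test station occupancy intervals for overlap; this O(n²) sketch uses hypothetical data shapes and would only be a starting point:

```python
def occupancy_conflicts(occupations):
    """occupations: list of (station_id, arrive, depart) tuples, one
    per train visit.  Returns index pairs of visits that overlap in
    time at the same station -- a naive quadratic collision check."""
    conflicts = []
    for i in range(len(occupations)):
        for j in range(i + 1, len(occupations)):
            s1, a1, d1 = occupations[i]
            s2, a2, d2 = occupations[j]
            if s1 == s2 and a1 < d2 and a2 < d1:
                conflicts.append((i, j))
    return conflicts
```

Detecting two trains travelling in the same location between stations would be harder, since it needs positions along the ways rather than just station occupancy.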
Of course the best method would be to have real-world GPS data, because users want information about the actual location of the train they’re waiting for. But we have neither a real world nor users who depend on that information. (Now that I think about it, not having a real world behind it might be the root cause of many of OGF’s problems …)
BTW, on the route_relations.html page, you can now zoom in to any location, and view the public transport relations of that area by pressing the “reload” button.
Are stops.txt, agency.txt, and routes.txt necessary? I’d think this information would be available directly in the relation/node data.
With stop_times.txt I definitely think more detail is needed. At least the duration of the stops must be known. Moreover, I can imagine lots of situations where some parts of a route allow for higher speed than others. (To make it even more complicated: two or more routes might share a single stop, which could cause waiting times.)
What I actually don’t understand is why stop_times.txt specifies fixed “time of day” values instead of just the time difference (in seconds) from the start point.
Entitled or not, it’s not a secret that OGF itself is basically a copy of OpenStreetMap. All necessary software components are freely available, and the setup process is documented reasonably well.
It might not be normal to demand information and tools for creating a copy, but the good news is that an OGF-like website can be created by copying OSM instead. The information and tools to do it are definitely out there, and the page linked by isleño gives a starting point.
Now I’m not saying that it’s easy. In fact, because of the architecture of several different software components working together, it’s actually somewhat complicated. Nevertheless, it can be done, but I wouldn’t attempt it without a solid base of Unix and RDBMS knowledge.
If the timetable URL is tagged in the route_master relation, then there’s no need for a unified naming scheme. Having one giant wiki article is probably a bad idea. Of course people need to agree on a standard for the timetable content.
I didn’t actually think about scraping the data from the wiki, but rather ad-hoc downloading and extracting whenever a timetable URL is encountered in a relation object. Performance-wise, scraping might be the preferable approach, but that can be added later if it turns out to be necessary. Maybe all pages containing timetables should be put into a common category, so the MediaWiki API can be used to query them.
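Querying such a category through the MediaWiki API could look like this; the category name “Timetables” and the endpoint URL are assumptions, not anything that exists yet:

```python
from urllib.parse import urlencode

def category_members_url(api_base, category):
    """Build a MediaWiki API request that lists all pages in a
    category.  api_base would be the wiki's api.php endpoint; the
    category name is a placeholder."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": "Category:" + category,
        "cmlimit": "500",
        "format": "json",
    }
    return api_base + "?" + urlencode(params)
```

Fetching that URL returns JSON with the page titles, which could then be downloaded and parsed for timetable tables.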
Another scraping method might be to run an Overpass query on all “route” or “route_master” relations containing the “timetable_url” tag.
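A sketch of assembling such an Overpass query, using the “ogf:timetable_url” tag name suggested in this thread (so the tag itself is still a proposal, not an established standard):

```python
def overpass_timetable_query(bbox=None):
    """Assemble an Overpass QL query that finds all route /
    route_master relations carrying an "ogf:timetable_url" tag.
    bbox, if given, is (south, west, north, east)."""
    area = "({},{},{},{})".format(*bbox) if bbox else ""
    return (
        "[out:json];"
        '(relation["type"="route_master"]["ogf:timetable_url"]{0};'
        'relation["type"="route"]["ogf:timetable_url"]{0};'
        ");out tags;"
    ).format(area)
```

The resulting string would be POSTed to an Overpass API endpoint; `out tags;` keeps the response small, since only the tag values (including the timetable URL) are needed.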
What about putting the URL directly into the “route_master” or “route” relation, with a tag like “ogf:timetable_url” ?
The drawback of everyone hosting their timetable on their own is probably that such a system will become prone to link rot over time.
@Luciano - that’s indeed how I’d probably do it. Remember that martinum4 linked to the GTFS specification in a recent thread:
Those contain a lot of elements that we don’t need, but I’d pick the relevant parts and convert them into some JSON structure, with one file per “route_master” relation, with the relation ID as filename.
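Such a per-relation JSON file might look like this; the field names are purely illustrative, not a fixed schema:

```python
import json

def timetable_json(relation_id, route_name, trips):
    """Serialize a trimmed-down GTFS subset as JSON, one document per
    route_master relation, intended to be stored as
    "<relation_id>.json".  'trips' is a list of
    (start_time, stop_intervals) pairs."""
    doc = {
        "relation": relation_id,
        "name": route_name,
        "trips": [
            {"start": start, "stop_intervals": intervals}
            for start, intervals in trips
        ],
    }
    return json.dumps(doc)
```

With the relation ID as the filename, a client can load the timetable for one route with a single request and without any lookup table.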
That way it would be easy to load timetables for specific routes. I don’t see anything wrong with that. My perspective might well change if the number of timetables got very large and this simple approach became unfeasible, but I wouldn’t want to put much effort into a more elaborated solution unless that’s really necessary.
Possible next step from here: figure out a timetable for some routes, and then implement train objects moving along the ways. Proof of concept here:
MultiMaps integration has now happened, based on Subway’s latest version:
I had to make a few changes to the original source, mostly to replace some functions that won’t be available in the MultiMaps environment, so please everyone use the code in the updated “route_relations.html” as the basis for further development.
I did some work preparing the MultiMaps integration. The result can be viewed here:
Most of the code for route-drawing has been moved to
The MultiMaps extension is here:
The most relevant files are these:
But I wouldn’t even think too much about the MultiMaps integration yet. I’ll gladly take the integration part, once the script as such has reached some degree of maturity.
(Unfortunately I’m not qualified to recommend an IDE for Ubuntu.)