QGIS and PostGIS were my jam when I worked in that space. We were an ESRI shop so Oracle/SQL Server with SDE (topped with ArcGIS) were the official tools. Some of us were always looking for ways to subvert the culture by building tools based on open source stacks.
One of my favorite experiences from that era: we were meeting with a few ESRI reps for some integration work. The lead hot-shot was on his phone playing around during the meeting, basically on autopilot. The other two folks were working with GeoJSON response converters. I said, "I built one of those with TopoJSON". One guy said, "I've never heard of it". I showed them how it was much more efficient, encoding shared arcs instead of redundant points. The lead dropped his phone and said, "I need you to tell me MORE about that". I showed them the service. They invited me to lunch; I politely declined and said, "today's my last day so I have a ton of things to wrap up". I do miss that realm sometimes.
Not having done much in terms of GIS work, I never had to deal with ESRI until last week.
I was on a call to ask an ESRI rep to add some labeled points to my client's existing map tool. It was a somewhat surreal, weird experience where I got the feeling they were making the work seem much more difficult than it was. Their estimate turned out to be insanely off the mark (by my reckoning), at eye-watering hourly rates.
At first I thought they had enough business and didn't really care about us. But reading this thread, it seems I was wrong. They are the Oracle of the GIS industry.
Is there a better option for raw base tilesets? When making my app mapping world tides (https://solunar.pages.dev) I tried OSM, for instance, but it only had raster tiles available at low-DPI resolutions. ESRI, on the other hand, had full vector support in a variety of formats^ and provided great docs for integrating with FOSS rendering toolkits. The free tier seems generous, though once it runs out the price increases sharply.
^ For instance, a really neat one that renders ocean features in as much detail as the land typically gets
That author doesn't say Esri is a scourge; more like it's just generally bad to have only 1 provider in a space -- and that it's up to customers to change that by voting with their feet.
I think Esri is (and has been) in a very similar position to Microsoft's in the late 1990s -- having achieved market dominance, they feel like open source software is the biggest threat to their business. But I think the presence of QGIS is creating competition, which is nothing but good for the industry.
Wow, these are gorgeous! Can I ask where you get your DEM data and at what resolution? I’ve been wanting to play around with some relief maps of various bioregions (eg the Great Lakes watershed, Cascadia, etc), but I’ve had trouble figuring out where to find data at the right resolution
Where do we get it?
Only publicly available sources. USGS has a great portal. Private data is too expensive to get; I was quoted 6 figures for a larger area. They were going to fly a plane and capture it :)
What resolution?
Totally depends on the area the customer would like to cover. If it's their ranch or property, we usually need 1-meter. If it's a mountain range, then 30-meter works.
It mainly depends on the resolution limit for 3D printing. So it also depends on the size of the model they want.
Unfortunately not all areas are covered with high res
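The resolution-vs-model-size trade-off above can be sketched with some quick arithmetic (my own illustrative numbers, not the poster's):

```python
# Back-of-the-envelope check: how big is one DEM cell on the printed model?
# All figures below are made-up examples for illustration.

def cell_size_on_print(area_width_m, dem_res_m, print_width_mm):
    """Printed size (mm) of one DEM cell for a model of a given width."""
    cells_across = area_width_m / dem_res_m
    return print_width_mm / cells_across

# A 2 km-wide property from a 1 m DEM on a 200 mm print:
# 2000 cells across -> 0.1 mm per cell, near the limit of most printers.
print(cell_size_on_print(2_000, 1, 200))    # 0.1

# A 60 km mountain range from a 30 m DEM on the same print:
# again 2000 cells across, so the coarser DEM loses nothing at this scale.
print(cell_size_on_print(60_000, 30, 200))  # 0.1
```

This is why 1-meter data only matters for small areas: past a certain extent, the print resolution, not the DEM, is the limiting factor.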
What are you using to get it into an STL? I've had OK results with DEMto3D plugin, but it has some weird artifacts that I can't seem to get rid of. Are there any better options you're aware of?
I actually use DEMto3D. It's touchy, but I do post-work on the .stl/3d model in blender so it works out ok for me.
If you have weird artifacts, I'm guessing that is due to the underlying data vs QGIS itself. Have you looked at their documentation (https://demto3d.com/en/)?
You have fewer products for sale than I was expecting.
I live in NE Los Angeles, which has the Verdugo and San Gabriel mtns, plus Mt Washington and other modest peaks - I think it would look great in this style. Especially because all the development would be excluded.
For every new location you do the process looks like:
1. Get the data and prep it for print (fixed)
2. 3D print it (fixed)
3. Rubber Mold (fixed)
4. Wax Model (variable)
5. Bronze (variable)
Steps 1-3 are 40-60% of the costs, so I haven't put the money out of pocket yet to put up new locations. I let customers ask first and then do them.
+1 to this. I can think of a few places I'd order for, but I couldn't figure out how. Did I miss the link? It looks like it refers to "custom orders", but I wasn't sure how that works.
Use these plates on a larger scale to make terra-cotta impressions and turn them into 'chia pets', such that you can grow microgreens on the landscape of a particular area.
The moment I graduated and lost access to ArcGIS, I got out QGIS and became ridiculously empowered. Yeah, Arc has Python APIs for some things, but QGIS, while rough in the UI department, gives you powerful access to everything. And it plays so well with other things: check out my presentation on using QGIS in robotics (pdf) https://roscon.ros.org/2018/presentations/ROSCon2018_Unleash...
QGIS is very powerful, but it's not exactly user friendly. The UI is hard to learn, and important functionality is buried several layers deep in non-intuitive drop-down menus and buttons.
I think a simplified GIS program with stripped-down functionality and an intuitive UI could be a big hit. Think of SketchUp versus SolidWorks or ProE.
In many ways that is what we're building at Felt (where I work). We don't offer much in the way of analysis, but we make it very easy (IMO of course) to share your maps.
I think part of the question (and problem) is, which part of QGIS do you focus on? The digitizing and geometry parts? The map-rendering part? Georeferencing? Vector and raster analysis?
Or to put it another way, what's the workflow that a simpler, more opinionated interface would solve, and what features could or would you sacrifice for it?
The most obvious ones. I'm still very new to GIS, but I find it amazing how difficult it is to simply draw some rectangles (of some specified dimensions) and lines on a map, and cleanly label the edges with distances. Annotating, moving, rotating things... is quite a pain compared to modern UIs. So I wonder if there's a slim and beginner-friendly open source version of QGIS.
Some years ago QGIS was useful to me in spite of its UI: I got a PDF map of certain infrastructure that was not georeferenced, and I was able to correct its projection to Web Mercator with great accuracy. But at the end I was somehow unable to find out how to export the data! QGIS did show me the transformation matrix it had built, and I wrote a program that applied that matrix to my files.
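Applying a georeferencing matrix by hand, as described above, usually means evaluating a 2D affine transform per point. A minimal sketch (the parameter convention matches the common six-number "world file" style; the values are made up for illustration, not the commenter's actual matrix):

```python
# Apply a 2D affine georeferencing transform by hand:
#   x' = a*x + b*y + c
#   y' = d*x + e*y + f

def apply_affine(params, points):
    a, b, c, d, e, f = params
    return [(a * x + b * y + c, d * x + e * y + f) for x, y in points]

# Identity scale plus a translation: pixel (10, 20) -> (110, 220)
params = (1.0, 0.0, 100.0, 0.0, 1.0, 200.0)
print(apply_affine(params, [(10, 20)]))  # [(110.0, 220.0)]
```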
In the vast majority of cases you cannot turn a PDF into spatial data, except for georeferencing it as an image. There are some exceptions, such as when the PDF already includes georeferenced vector data, but it's very unlikely you had that sort of file given the issue you described.
You likely either:
- needed to export an image of the data (raster);
- needed to digitise the data within;
- needed to extract the vector data first from the pdf and work out how to deal with it.
Opening a PDF without appropriate data structures in a GIS software package is akin to taking a picture of a billboard that has a printed image of a map on it and then expecting to be able to do anything with that picture. The PDF is a bit better, but not by much, unless it began life as a georeferenced PDF with vector data maintained.
No argument there.
I've worked with ArcGIS, Autocad, Sketchup, Civil3D, and QGIS. Though clumsy, QGIS is the most powerful by far of these applications.
I've been using QGIS a bit for simple things. Viewing OSM road data in a PostGIS database. Viewing geojson files. Everything has been more difficult than I expected, but I think that might just come with the territory. I'm working through PostGIS in Action to try to get the requisite background knowledge.
One thing that's frustrated me, and I want to make sure it isn't a misconfiguration on my end, is that QGIS feels really slow. For example, I have a 150MB geojson file which has 300,000 points with associated metadata. Even when I'm zoomed in such that I can only see a few thousand points at a time, if I pan the map over by 50% it takes at least 10 seconds before it loads in the new points. Many long operations seem to take place synchronously on the UI thread, so the whole app is unresponsive while they take place. Clicking the drop down arrow on a PostGIS Schema to view the tables spins for several minutes. No other Postgres tool I have takes that long, so it's not the database. The PostGIS import/export tools were also extremely slow and didn't have progress bars. I'm using 3.24 currently. I don't want to rag on it too hard, but it's really hampered my enthusiasm for working with maps and GIS.
QGIS is always a little more difficult than expected. Even as someone with a master's in GIS, it's still hard for me to remember where they put things or how to do something. It's extremely powerful but can be extremely frustrating too.
GeoJSON is unfortunately one of the worst formats to keep your working data in. It's great for transport and interoperability, but there is no way to index the data in the format. It doesn't matter if you are only drawing a few thousand points; it is still looking through all of them to see which ones to draw.
GIS is awesome; don't let one tool get you down. If it's something interesting to you, there is a lot more to it than just QGIS. Part of my pain with QGIS is that I rarely ever use it, so I forget what I learned last time. I spend most of my time purely in Python and don't really need it.
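The no-index point above is easy to see in miniature: with raw GeoJSON, answering "which features are in the current view?" means testing every feature, every pan. A toy sketch:

```python
# Why a raw GeoJSON layer is slow: with no spatial index, finding the
# features inside the current viewport means scanning every feature.

def points_in_view(features, xmin, ymin, xmax, ymax):
    hits = []
    for feat in features:  # O(n) scan; repeated on every pan/zoom
        x, y = feat["geometry"]["coordinates"]
        if xmin <= x <= xmax and ymin <= y <= ymax:
            hits.append(feat)
    return hits

# A synthetic world-wide grid of point features:
features = [
    {"type": "Feature", "geometry": {"type": "Point", "coordinates": [lon, lat]}}
    for lon in range(-180, 180, 10)
    for lat in range(-80, 90, 10)
]
print(len(points_in_view(features, 0, 0, 30, 30)))  # 16
```

A spatially indexed format (GeoPackage, FlatGeobuf) replaces that scan with a tree lookup, which is why the conversions discussed below help so much.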
You'd probably be well served by exporting to some format with a spatial index. GeoJSON is just a text file, not much better than reading in a CSV. The current recommended format is GeoPackage (gpkg), which is based on SQLite.
Edit: agreed on the slowness of DB connections, I've found that too, including for relatively small locally-hosted DBs.
Yeah, converting it to gpkg sped it up a lot, thanks!
I assumed that, since GeoJSON is completely unsuitable for querying directly, it would load the whole thing into a native in-memory format, but perhaps not.
I don't think GeoJSON is a great format for anything with more than a few MB of data.
I wanted to see exactly how bad it is with a largish dataset, so I exported the New Zealand address dataset[1] with ~2.5M points as a GeoPackage (750MB). QGIS loads this fine; it's a little slow when viewing the entire country, but when zoomed in to city level it is almost instant to pan around.
Using ogr2ogr I converted it to ndgeojson (2.5GB); it crashed my QGIS while trying to load it. Using shuf I created a random 100,000-point GeoJSON (~110MB); it was unbearably slow in QGIS while panning around, 5+ seconds per pan.
I currently use and recommend FlatGeobuf[2] for most of my working datasets, as it is super quick and doesn't need SQLite to read (e.g. in a browser).
It is also super easy to convert to/from with ogr2ogr
No, you're not a victim of misconfiguration. QGIS is very slow with large datasets, no matter what source you use (even if you have Oracle with costly Spatial extensions). It is due to some unfortunate design decisions: the app always loads the full dataset from the source, and draws/shows every single feature. It is possible (through a recent, obscure checkbox in the prefs) to tell it to skip smaller objects when putting everything on the screen, but generally it is not possible to tell it "do not load features from PostGIS that are going to render to only a few pixels".
Another reason is that the underlying C++ engine is tied in such a way to the fairly large Python codebase that it seldom uses all CPU cores, so you have a single-process app. Add to this the fact that the GPU is not used extensively and you get one very slow app.
So, yeah, for datasets larger than 50k points it's slow on arbitrary hardware. Some analyses are impossible to run unless you go to PostGIS.
To be honest, ESRI is not a company that everyone loves, but they really invest in ArcGIS Pro, even though they also are not there yet. Both QGIS and ESRI shy away from spatial SQL, which is many times more effective for spatial analysis.
To be honest, QGIS is quite old-school in design, and the core devs know that it would take enormous effort to reimplement it from scratch... at least the slow parts. The fact that it works, is open source, and does the fundamental things you need does not make it the _top_ software. This "open source ftw" attitude is really stupid when you have to work with hundreds of layers with hundreds of thousands of points.
Make sure you have estimated table metadata turned on; otherwise QGIS will run a bunch of queries to understand your tables, I believe.
We typically use QGIS as a viewing engine only. If you let PostGIS do the heavy lifting it's a beautiful setup, especially with a tuned DB and an indexed, clustered Postgres table.
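"Let PostGIS do the heavy lifting" mostly means pushing the viewport filter into SQL, so only features intersecting the current view cross the wire. A hedged sketch of that kind of query (table and column names here are made up; `ST_MakeEnvelope` and the `&&` bounding-box operator are standard PostGIS, and `&&` can use a GiST index on the geometry column):

```python
# Build a PostGIS viewport query: fetch only rows whose geometry's
# bounding box intersects the current view rectangle.
# "roads", "id", and "geom" are hypothetical names for illustration.

def viewport_query(table, xmin, ymin, xmax, ymax, srid=4326):
    return (
        f"SELECT id, ST_AsBinary(geom) "
        f"FROM {table} "
        f"WHERE geom && ST_MakeEnvelope({xmin}, {ymin}, {xmax}, {ymax}, {srid});"
    )

print(viewport_query("roads", -122.5, 37.7, -122.3, 37.9))
```

In a real setup you would run this through a parameterized query rather than string formatting, but the shape of the WHERE clause is the point.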
> Make sure you have estimated table metadata turned on, otherwise QGIS will run a bunch of queries to understand your tables I believe.
That fixed it, thanks! Looking at the manual, there is a tip that tells you to turn that on, otherwise it will read the entire table to characterize the geometries... It's a bit mad that the default behavior has it querying potentially gigabytes of information across the network every time you open the app and click a dropdown, with no progress bar. But it's definitely the type of app where you need to read the manual, and it says so right there.
I’m slowly making a map of my back yard in QGIS. One annoyance: I live in a place where tectonic movement over my expected lifespan is significant on the scale of my map. And so are the supposed inaccuracies in various coordinate systems. (I want to record where things are so I can find them again without digging big holes!)
As far as I can tell, QGIS has no particular understanding of either a coordinate relative to the (moving!) crust or of a coordinate in space-time that can be projected to space at future or past times. Surely this should be a thing!
I found HTDP, a web tool that can shift coordinates forward and back in time:
(I have an RTK-capable GPS and NTRIP data via UNAVCO from a nearby CORS station. I was hoping that storing position relative to such-and-such CORS station as it was on such—and-such date would be straightforwardly doable in QGIS.)
Wow that’s a pretty interesting (and scary) challenge.
I know that the coordinate systems get revised every few years. In Australia we used to use AGD84; then it was revised to GDA94 and more recently to GDA2020, and this was to account for tectonic shifts. And there are calculations for transforming from one to the other.
That’s the only clues I have on that side of things and I’ve forgotten most of what I’ve known.
Also there’s a concept called “rubber sheeting” for transforming points where you know the amount of error at a bunch of points and you want to transform all the other points by interpolating how much error there would be at those points. The might be useful too, if you’re implementing a solution yourself.
I am doing the same. Mainly I wanted to record where pipes and such are located, but I might as well record buildings, fences, and trees while I'm at it. I was not worried about geologic movement, but all the global coordinate systems felt very clunky when locally mapping at the centimeter level. I finally settled on having a well-defined surveyed reference point and using meters north and east of that.
Perhaps that would work for you, because while the plate is shifting with respect to the globe, everything on that plate will maintain the same relative position.
I set up my own NTRIP base station at a fixed point in the middle of my roof with a hefty bracket, did a PPP survey to determine its location once, and am considering everything relative to that. This spring when I'm back at the surveying, I'll likely do another PPP run and make sure it hasn't moved too much. If it has, I guess I'll have to figure out how to reconcile that. From what I've gathered, I don't think this is too far off from how real surveying works.
I didn't see much reason to use a different unit than degrees. Although while I haven't gotten super in depth learning QGIS, it feels like there's an impedance mismatch in that it seems to be a 2D program that treats degrees as linear units (and then applies a fudge factor to degrees longitude), rather than a native 3D program. So I'm doing all of my collection, point storage, and calculation with scripts outside of QGIS, and then only pushing the cooked results to QGIS for visualization.
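That "fudge factor" is the cos(latitude) shrinkage of a degree of longitude. For back-yard scales the flat-grid approximation is good to well under a centimeter; a rough sketch of the lat/lon-to-local-meters conversion (the 111,320 m/degree figure is an approximation and varies slightly with latitude):

```python
# Approximate conversion from lat/lon near a reference point to local
# metres north/east, treating the neighbourhood as a flat grid.
import math

M_PER_DEG_LAT = 111_320.0  # rough metres per degree of latitude

def to_local_metres(ref_lat, ref_lon, lat, lon):
    north = (lat - ref_lat) * M_PER_DEG_LAT
    east = (lon - ref_lon) * M_PER_DEG_LAT * math.cos(math.radians(ref_lat))
    return east, north

# One ten-thousandth of a degree north of the reference is about 11 m:
east, north = to_local_metres(45.0, -122.0, 45.0001, -122.0)
print(round(north, 2))
```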
Instead of a reference + north/east meters, how about two reference points, with everything referenced as polar coordinates from the first, the direction from point A to point B being 0 degrees? My concern would be if the direction of north/east changed over time from the plate movement.
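The two-reference-point idea might look like this in plane geometry (assuming the inputs are already in local metres; a hypothetical sketch, not a surveyed workflow):

```python
# Express a point as (distance, bearing) from reference A, with bearings
# measured relative to the A->B direction so the frame moves with the plate.
import math

def polar_from_baseline(ax, ay, bx, by, px, py):
    baseline = math.atan2(by - ay, bx - ax)       # direction A -> B
    dist = math.hypot(px - ax, py - ay)           # distance A -> P
    deg = math.degrees(math.atan2(py - ay, px - ax) - baseline)
    deg = (deg + 180.0) % 360.0 - 180.0           # normalise to (-180, 180]
    return dist, round(deg, 6)

# B sits 100 m east of A; a point 50 m north of A is at 90 degrees:
print(polar_from_baseline(0, 0, 100, 0, 0, 50))  # (50.0, 90.0)
```

If the plate rotates, A, B, and P all rotate together, so the stored (distance, bearing) pairs stay valid.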
I used to program python plugins for QGIS, I really miss it. It made me feel really useful, I'd write visualization algorithms one day and the next it was deployed to the whole company and people would email me telling me how much easier it made their job. I don't think I've ever felt this valued since, every other job I had was a small part of a big whole.
Hello, would love to chat with you about QGIS plugins. Email me at devjobs@anor.io if you would be interested in making a QGIS plugin for the solar PV industry :)
https://blog.cleverelephant.ca/2018/11/esri-dominates.html
(I worked as a developer for Esri for 15 years)
I've had difficulty making it work for production services (especially extensions). Ultimately moved more towards the Python/OSM stack.
Such a fun space.
We make 3D Maps of American Landscapes in bronze.
We take Digital Elevation Model (DEM) data, do light transformations in QGIS and convert it to an .STL file before additional 3D modeling.
Our latest project was a hairy one doing Oahu (https://terramano.co/blogs/product/oahu-bronze-3d-map)
https://www.usgs.gov/the-national-map-data-delivery/gis-data...
They have full US coverage and many infill sets at higher resolutions.
They have 1 metre DEMs ( ~ one elevation per three foot x three foot square )
https://www.sciencebase.gov/catalog/items?q=&filter=tags=Dig...
and more if you get to know their community and products (they are a firehose of likely more data than many can afford to reliably store).
(Near) global coverage, 90m resolution. Easy to fetch tiles with a script.
I outline how the whole process works here https://www.gregkamradt.com/gregkamradt/2020/2/29/manufactur...
Surprisingly, most of our orders have been custom
Here's my info packet on the custom process https://docs.google.com/document/d/1IkiHG_Z5JS03mWYHv-KNAhi8...
Anyway they look beautiful.
Do you have a video link of what you're referring to?
I once tried to use the molds to make chocolate representations of the mountains ha! I learned the hard way that tempering is difficult for a novice
https://blog.qgis.org/2023/01/16/crowd-funding-call-2023/
Here are a couple of maps that I made of the Honolulu marathon that I wouldn't have made in a more complex, time-intensive, powerful piece of software: - https://felt.com/map/UNOFFICIAL-Honolulu-Marathon-2022-TCg9C... - https://felt.com/map/UNOFFICIAL-Honolulu-Marathon-2022-Road-...
These two already dominate more casual mapping
ogr2ogr -f flatgeobuf output.fgb input.geojson
[1] https://data.linz.govt.nz/layer/105689-nz-addresses/data/ [2] https://github.com/flatgeobuf/flatgeobuf
https://www.ngs.noaa.gov/TOOLS/Htdp/Htdp.shtml
And I found this discussion:
https://www.gpsworld.com/the-effects-of-tectonic-plate-movem...
But I haven’t found anything easy to use.
I'd kill to find a full remote job doing that.