OSGeo Planet

CARTO Blog: CARTO's Data Observatory 2.0, now powered by Google BigQuery

OSGeo Planet - Wed, 2019-10-16 09:00
Today, at the Spatial Data Science Conference, we presented the recently launched Data Observatory 2.0 (DO). A new platform to discover and consume location data for spatia...
Categories: OSGeo Planet

Fernando Quadro: Online GeoServer Course in December

OSGeo Planet - Tue, 2019-10-15 14:36

Dear readers,

I would like to invite you to take part in the online GeoServer course that I will be teaching through GEOCURSOS. The goal of the course is to teach you how to publish, share, and edit geographic data on the internet with GeoServer.

The course will cover topics such as data configuration, style creation with SLD, OGC standards, the administrative (web) interface, cartographic visualization with OpenLayers, the REST API, and security, among others.

The course will take place between December 3rd and 12th (Tuesdays, Wednesdays, and Thursdays), from 8:00 pm to 10:00 pm (Brasília time).

I would be grateful to anyone who can share this with their contacts. If you would like more information about the course, you can find it at the following links:

Categories: OSGeo Planet

MapTiler: OpenStreetMap in WGS84

OSGeo Planet - Tue, 2019-10-15 11:00

Global maps based on OpenStreetMap in the WGS84 map projection are available on MapTiler Cloud via the maps API. This complements the already existing maps in the Web Mercator projection, which is the de facto industry standard.

Why are there so many map projections?

The Earth is not flat, so we need a mathematical way to project the globe onto a 2D plane. This can be done in many ways, but every solution, called a map projection, has its pros and cons.

Over time, Web Mercator became the de facto industry standard. However, like any other map projection, it has downsides: the most visible is size distortion. Areas near the poles are displayed much larger, while the equatorial zone appears much smaller than in reality.

Various local and national coordinate systems were created because many maps historically covered only a limited territory, so they use a projection that best fits that specific territory. Local coordinate systems are still used by governments and are required for official maps.


Some of the map projections. Image ©Tobias Jung CC BY-SA 4.0

Vector and raster OSM tiles in EPSG:4326

To fulfill specific needs, MapTiler Cloud now adds base maps derived from OpenStreetMap in alternative coordinate systems, starting with WGS84.

https://api.maptiler.com/tiles/v3-4326/{z}/{x}/{y}.pbf?key={YOUR-OWN-KEY}

There is an OpenLayers 6 vector viewer code snippet, raster tiles rendered on demand, a WMTS service for use in desktop GIS software, and a static maps API.
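
To make the {z}/{x}/{y} template above concrete, here is a hedged shell sketch that computes the tile indices covering a given longitude/latitude. It assumes the common WorldCRS84Quad layout for EPSG:4326 tiles (two tiles across, one tall, at zoom 0); check the endpoint's TileJSON for the actual grid, and note that {YOUR-OWN-KEY} is the placeholder from the URL above.

```shell
#!/bin/sh
# Compute the XYZ tile covering a lon/lat at zoom z, assuming the
# WorldCRS84Quad grid (tile width = 180/2^z degrees, origin at the top left).
lon=0; lat=0; z=1
deg=$(awk -v z="$z" 'BEGIN { print 180 / 2^z }')
x=$(awk -v lon="$lon" -v d="$deg" 'BEGIN { printf "%d", (lon + 180) / d }')
y=$(awk -v lat="$lat" -v d="$deg" 'BEGIN { printf "%d", (90 - lat) / d }')
echo "https://api.maptiler.com/tiles/v3-4326/$z/$x/$y.pbf?key={YOUR-OWN-KEY}"
```

For lon=0, lat=0 at zoom 1 this selects tile x=2, y=1 of the 4x2 grid.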

Maps in Lambert, Rijksdriehoekstelsel and other national coordinate systems

Among the local coordinate systems, French Lambert and Dutch Rijksdriehoekstelsel are available, and many others will come soon!

MapTiler Cloud is also able to host maps in any coordinate system with an EPSG code. Just upload your map in GeoPackage format or create one with the new MapTiler Desktop 10.2.

OpenStreetMap in French Lambert and Dutch Rijksdriehoekstelsel map projections


Free maps API with local coordinate systems

Start using maps in WGS84, Lambert or Rijksdriehoekstelsel for free via MapTiler Cloud.

Categories: OSGeo Planet

gvSIG Team: Program of the 15th International gvSIG Conference is now available

OSGeo Planet - Mon, 2019-10-14 16:27

The program of the 15th International gvSIG Conference is now available. It includes a great number of presentations and several free workshops for both users and developers.

The conference will take place from November 6th to 8th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain). Registration must be done through the form available on the event website.

Registration for the workshops is independent and will open on October 15th. All the workshop information will be published on the gvSIG Blog soon.

Categories: OSGeo Planet

gvSIG Team: The program of the 15th International gvSIG Conference is now available

OSGeo Planet - Mon, 2019-10-14 16:21

The program of the 15th International gvSIG Conference is now available, with a wide variety of talks and several free workshops for both users and developers.

The conference will take place from November 6th to 8th at the School of Engineering in Geodesy, Cartography and Surveying (Universitat Politècnica de València, Spain). To attend, you must register in advance using the form on the event website. We recommend not waiting until the last minute, as the rooms have limited capacity.

A separate registration is required for the free workshops, which will open on October 15th. All related information will be available on the gvSIG blog.

Categories: OSGeo Planet

Oslandia: (Fr) Rechercher une adresse avec QGIS

OSGeo Planet - Mon, 2019-10-14 09:14

Sorry, this entry is only available in French.

Categories: OSGeo Planet

PostGIS Development: PostGIS 3.0.0rc2

OSGeo Planet - Sun, 2019-10-13 00:00

The PostGIS development team is pleased to release PostGIS 3.0.0rc2. This will be the final RC before release.

This release works with PostgreSQL 9.5-12 and GEOS >= 3.6.

Best served with PostgreSQL 12, GEOS 3.8.0 and pgRouting 3.0.0-alpha.

Continue Reading by clicking title hyperlink ..
Categories: OSGeo Planet

XYCarto: Experimenting with Hydrological Analysis using TauDEM

OSGeo Planet - Thu, 2019-10-10 21:47

Blue Rivers Ordered

Over the past few years, I’ve played around with developing ordered river networks for different projects. I am not an expert in hydrology, but I can get close for cartographic purposes. I am an expert, however, at asking for help from those who know best, and I rely on a lot of very smart people to guide me on my journey.

Recently, I decided to put together a visualization of ordered rivers for New Zealand. I came across a very nice data set offered through the Ministry for the Environment via the Koordinates website and thought I’d like to put it to use.

The rivers vis project made me wonder if I could build this base dataset myself using some of the recently released elevation datasets from the LINZ Data Service. The short answer to my question is “sorta”. Doing it open source is not an issue, but building an accurate ordered river centerline network is another story. This is a task I cannot take on as a solo project right now, but I could do a little experimentation. Below, I’ll offer some of the methods and things I learned along the way.

Tools and Data

The method I tested used TauDEM and a 1 m DEM raster accessed from the LINZ Data Service. I down-sampled the DEM to 2 m and 5 m resolutions and used small areas for testing. Finding an open source tool was easy. I sorted through a few available methods and finally landed on “Terrain Analysis Using Digital Elevation Models” (TauDEM). There are additional methods in GRASS and SAGA GIS. I chose TauDEM because I had never used it before.

Method Tested

To my knowledge, there is no open source tool where a person can put in a DEM and get a networked river centerline vector out the other side. It requires a number of steps to achieve your goal.

The basic run down to process the DEM is to:

  1. Fill sinks
  2. Determine flow directions
  3. Determine watersheds
  4. Determine flow accumulation
  5. Stream classification
  6. Export to vector

TauDEM does require a few extra steps to complete the process, but these steps are explained in the documentation of the tool. It was more about keeping all my variables in the right places and using them at the right time. I recommend using the variable names TauDEM provides.


The full BASH script:

#!/bin/bash
# Rough sketch for building river centerlines. Rasters have been clipped prior.
BASEPATH=/dir/path/to/base
raster_list=$( find $BASEPATH -name "*.tif" )
taudem_outputs=/dir/path/to/outputs
reso=resolution_number

for i in $raster_list
do
    INPUT_RASTER=$i
    file_name=$( basename $i )
    strip_input_extension=$( echo $file_name | sed 's/.tif//' )
    reso_name=$taudem_outputs/${strip_input_extension}_${reso}res

    # Down-sample the input raster to the target resolution
    gdal_translate -tr $reso $reso -of GTiff $i $reso_name.tif

    # Output file names (keeping TauDEM's variable naming convention)
    fel=${reso_name}_fel.tif
    p=${reso_name}_p.tif
    sd8=${reso_name}_sd8.tif
    ad8=${reso_name}_ad8.tif
    ang=${reso_name}_ang.tif
    slp=${reso_name}_slp.tif
    sca=${reso_name}_sca.tif
    sa=${reso_name}_sa.tif
    ssa=${reso_name}_ssa.tif
    src=${reso_name}_src.tif
    ord=${reso_name}_strahlerorder.tif
    tree=${reso_name}_tree.dat
    coord=${reso_name}_coord.dat
    net=${reso_name}_network.shp
    w=${reso_name}_watershed.tif
    processed_input_file=$reso_name.tif

    # TauDEM commands
    mpiexec -n 8 pitremove -z $processed_input_file -fel $fel    # fill sinks
    mpiexec -n 8 d8flowdir -fel $fel -p $p -sd8 $sd8             # D8 flow directions
    mpiexec -n 8 aread8 -p $p -ad8 $ad8 -nc                      # D8 flow accumulation
    mpiexec -n 8 dinfflowdir -fel $fel -ang $ang -slp $slp       # D-infinity flow directions
    mpiexec -n 8 areadinf -ang $ang -sca $sca -nc                # D-infinity flow accumulation
    mpiexec -n 8 slopearea -slp $slp -sca $sca -sa $sa           # slope-area product
    mpiexec -n 8 d8flowpathextremeup -p $p -sa $sa -ssa $ssa -nc
    mpiexec -n 8 threshold -ssa $ssa -src $src                   # stream classification
    mpiexec -n 8 streamnet -fel $fel -p $p -ad8 $ad8 -src $src -ord $ord -tree $tree -coord $coord -net $net -w $w   # watersheds, vector network
done

The script is a rough sketch, but does get results.

Challenges in the Process

One major challenge for this project was the size of the input DEM versus my computer’s available RAM. I work primarily off a laptop. It’s a good machine but no match for a proper server setup with some spacious RAM. My laptop struggled with the large high-resolution DEMs, so I needed to down-sample the images and choose a smaller test area to get it to work.
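
A quick back-of-the-envelope sketch shows why down-sampling helps so much: doubling the cell size quarters the pixel count. The 20 km x 20 km tile size and 4-byte (float32) pixels below are hypothetical numbers, not from the post.

```shell
#!/bin/sh
# Memory needed per in-memory copy of a DEM at different resolutions,
# for a hypothetical 20 km x 20 km tile with float32 pixels.
extent_m=20000
for res in 1 2 5; do
  mib=$(awk -v e="$extent_m" -v r="$res" 'BEGIN {
    px = (e / r) * (e / r)                 # pixel count at this resolution
    printf "%.0f", px * 4 / 1024 / 1024    # float32 bytes -> MiB
  }')
  echo "${res} m resolution: ~${mib} MiB per in-memory copy"
done
```

At 5 m, the same tile needs roughly 1/25th of the memory of the 1 m original, before counting the intermediate rasters TauDEM produces.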

Clip the TIFF with gdal_translate -projwin and down-sample with -tr:

gdal_translate -tr xres yres -projwin ulx uly lrx lry input.tif output.tif

The second challenge came up because I used a bounding box to clip my test regions. I recommend not doing this; instead, clip your regions using a watershed boundary. Square test regions will give you very inaccurate and unhelpful results: for example, major channels in your DEM will be cut at the edges of your raster.

Clipping a raster using a shapefile, like a watershed boundary, can be achieved using gdalwarp.

gdalwarp -cutline input.shp input.tif output.tif

Results

I ran my process and QCed the results against aerial imagery and a hillshade I developed from the DEM. The first run gave me good enough results to know I have a lot of work to do, but I did manage to develop a process I was happy with. The tool did a great job; the accuracy of the DEM was a little more challenging. It’s a start. I captured a good number of river channels despite my incorrect use of a square DEM, learned a lot about how DEM resolution affects outputs, and gained knowledge about how to spot troublesome artifacts.

Img 1: River capture in a well-defined channel.

From this experiment, there are a few ideas I’d like to explore further:

1. Accuracy of the DEM. The particular DEM I worked with had a number of ‘dams’ in the flows: notably bridges, culverts, vegetation artifacts, and other general errors that caused water to flow in interesting directions. When working with a dataset like this, I am curious how to manage these artifacts.

Img 2: River diversion at a road.

Img 3: River diversion at a culvert or bridge.

2. How to go beyond borders. This analysis can be broken down by watershed, but it will be necessary to link the outflows of each watershed to the next one for accurate results.

Img 4: Flow not captured at the raster edge.

3. As DEMs are released at better resolutions, there is a need for scaled-up computing power. The process needs a large amount of RAM. What is the best computational setup for capturing the largest area?

4. Did I do this correctly? I perform this task about once every two years, usually on weekends when the surf is flat and the garden is weeded, so I am not an expert. There is a lot more research to be done to determine whether I am using the tools to the best of their abilities.

Categories: OSGeo Planet

CARTO Blog: Spatial Data Simplified: Introducing Data Observatory 2.0

OSGeo Planet - Thu, 2019-10-10 09:00
Data is an essential ingredient for spatial analysis—which is predicated on access to your own data, plus useful third-party data. In the end, spatial analysis is about put...
Categories: OSGeo Planet

CARTO Blog: CARTO and SafeGraph Team Up for Behavior Pattern Insights

OSGeo Planet - Wed, 2019-10-09 09:00
Today, perhaps more than ever before, people are on the move. Transitioning between roles as consumers, parents, employees, tourists, and more means making a lot of stops i...
Categories: OSGeo Planet

PostGIS Development: PostGIS 3.0.0rc1

OSGeo Planet - Tue, 2019-10-08 00:00

The PostGIS development team is pleased to release PostGIS 3.0.0rc1.

This release works with PostgreSQL 9.5-12 and GEOS >= 3.6.

Best served with PostgreSQL 12, GEOS 3.8.0rc2 and pgRouting 3.0.0-alpha.

Continue Reading by clicking title hyperlink ..
Categories: OSGeo Planet

Oslandia: QGIS Versioning now supports foreign keys!

OSGeo Planet - Mon, 2019-09-30 07:52

QGIS-versioning is a QGIS and PostGIS plugin dedicated to data versioning and history management. It supports:

  • Keeping full table history with all modifications
  • Transparent access to current data
  • Versioning tables with branches
  • Working offline
  • Working on a data subset
  • Conflict management with a GUI

QGIS versioning conflict management

In a previous blog article we detailed how QGIS versioning can manage data history, branches, and work offline with PostGIS-stored data and QGIS. We recently added foreign key support to QGIS versioning so you can now historize any complex database schema.

This QGIS plugin is available in the official QGIS plugin repository, and you can fork it on GitHub too !

Foreign key support TL;DR

When a user decides to historize their PostgreSQL database with QGIS-versioning, the plugin alters the existing database schema and adds new fields in order to track the different versions of a single table row. Every access to these versioned tables is subsequently made through updatable views in order to automatically fill in the new versioning fields.

Up to now, it was not possible to deal with primary keys and foreign keys: the original tables had to be constraint-free. This limitation has been lifted thanks to this contribution.

To put it simply, the solution is to remove all constraints from the original database and transform them into a set of SQL check triggers installed on the working copy databases (SQLite or PostgreSQL). As verifications are made on the client side, it is impossible to propagate invalid modifications to your base server when you “commit” updates.

Behind the curtains

When you choose to historize an existing database, a few fields are added to the existing tables. Among these fields, versioning_id identifies one specific version of a row. For one original row, there are several versions, each with a different versioning_id but with the same original primary key value. As a consequence, that field can no longer satisfy a unique constraint, so it cannot be a primary key, and therefore cannot be the target of a foreign key either.

We therefore have to drop the primary key and foreign key constraints when historizing the table. Before removing them, constraints definitions are stored in a dedicated table so that these constraints can be checked later.

When the user checks out a specific table on a specific branch, QGIS-versioning uses that constraint table to build constraint checking triggers in the working copy. The way constraints are built depends on the checkout type (you can checkout in a SQLite file, in the master PostgreSQL database or in another PostgreSQL database).

What do we check ?

That’s where the fun begins! The first things to check on insert or update are key uniqueness and that every foreign key references an existing key. Remember that there are no primary keys or foreign keys anymore; we dropped them when activating historization. We keep the terms for better understanding.

You also have to deal with deleting or updating a referenced row and the different ways of propagating the modification: cascade, set default, set null, or simply failure, as explained in the PostgreSQL foreign keys documentation.

Never mind all that: this problem has been solved for you, and everything is done automatically in QGIS-versioning. Before you ask: yes, foreign keys spanning multiple fields are also supported.

What’s new in QGIS ?

When you try to make an invalid modification and commit your changes to the master database, you will get a message you probably already know:

Error when foreign key constraint is violated

Partial checkout

One existing QGIS-versioning feature is partial checkout. It allows a user to select a subset of data to check out into their working copy, which avoids downloading gigabytes of data you do not care about. You can, for instance, check out features within a given spatial extent.

So far, so good. But if you have only part of your data, you cannot ensure that modifying a primary key field will preserve uniqueness. In this particular case, QGIS-versioning will trigger errors on commit, pointing out the invalid rows you have to modify so that the unique constraint remains valid.

Error when committing a non-unique key after a partial checkout

Tests

There is a lot to check when you intend to replace the existing constraint system with your own trigger-based one. In order to ensure QGIS-versioning’s stability and reliability, we put special effort into building a test set that covers all use cases and possible exceptions.

What’s next

There are now no known limitations on using QGIS-versioning on any of your databases. If you think of a missing feature or just want to know more about QGIS and QGIS-versioning, feel free to contact us at infos+data@oslandia.com. And please have a look at our support offering for QGIS.

Many thanks to eHealth Africa who helped us develop these new features. eHealth Africa is a non-governmental organization based in Nigeria. Their mission is to build stronger health systems through the design and implementation of data-driven solutions.

Categories: OSGeo Planet

QGIS Blog: User question of the Month – Sep’19

OSGeo Planet - Sat, 2019-09-28 10:43

After the summer break, we’re back with a new user question.

This month, we want to focus on documentation. Specifically, we’d like to know how you learn how to use QGIS.

The survey is available in English, Spanish, Portuguese, French, Ukrainian, and Indonesian. If you want to help us translate user questions into more languages, please get in touch on the community mailing list!

Categories: OSGeo Planet

Jo Cook: FOSS4GUK 2019

OSGeo Planet - Sat, 2019-09-28 10:00
FOSS4GUK (https://uk.osgeo.org/foss4guk2019/) came and went a week or so ago, in Edinburgh, and to my mind it was a game-changer for our UK events. This is not going to be a detailed post about how great it was (yes it was great), and how good the venue was (also great), but a reflection on how it was different. For one, there were 250 attendees, which is a step up from previous events.
Categories: OSGeo Planet

Jackie Ng: Announcing: mapguide-react-layout 0.12.3

OSGeo Planet - Sat, 2019-09-28 07:23
This bugfix release fixes a bug where the initial map view (if specified in a flexible layout) was parsed incorrectly, breaking the viewer as a result.

Project Home Page
Download
mapguide-react-layout on npm
Categories: OSGeo Planet

PostGIS Development: PostGIS 3.0.0beta1

OSGeo Planet - Sat, 2019-09-28 00:00

The PostGIS development team is pleased to release PostGIS 3.0.0beta1.

This release works with PostgreSQL 9.5-12RC1 and GEOS >= 3.6.

Best served with PostgreSQL 12RC1 and GEOS 3.8.0beta1, both of which came out in the past couple of days.

Continue Reading by clicking title hyperlink ..
Categories: OSGeo Planet

Fernando Quadro: OpenLayers 6.0 Released

OSGeo Planet - Fri, 2019-09-27 12:05

Dear readers,

Yesterday, around 10 pm, Tim Schaub announced on the OpenLayers GitHub that the long-awaited (at least by me) version 6.0 is officially available. There have been more than 1,780 commits and 540 pull requests since version 5.3.

Among the new features, an important one in this version is the ability to compose layers with different renderer types. Previously, the map used a single rendering strategy, and all layers on your map had to implement that strategy.

It is now possible to have a map with layers that use different rendering technologies. This makes it possible, for example, to compose a Canvas (2D) layer together with a WebGL-based layer on the same map. It is also possible to create layers with custom renderers, so you can have a map that uses another library (such as d3) to render one layer while OpenLayers renders the other layers.

In addition, version 6.0 includes several improvements to vector tile rendering and should consume less memory overall. The release also includes several experimental features that are not yet part of the stable API, such as a new WebGL-based renderer and the experimental useGeographic() function.

This release includes several changes that are incompatible with previous versions, so it is important to read the release notes to see what has changed since version 5.3.

Source: the OpenLayers GitHub

Categories: OSGeo Planet

CARTO Blog: Spatial Leaders from Around the Globe Converge on SDSC19

OSGeo Planet - Fri, 2019-09-27 09:00
The 3rd annual Spatial Data Science Conference is just three weeks away and we couldn’t be more excited. Being held at Columbia University, this year’s iteration of the for...
Categories: OSGeo Planet

From GIS to Remote Sensing: SCP Tips: Temporary Directory

OSGeo Planet - Fri, 2019-09-27 08:00
Tips about the Semi-Automatic Classification Plugin for QGIS

Processing satellite images requires a large amount of disk space for temporary files (required for processing but deleted afterward). In case your system disk has low space available, you can change the SCP temporary directory to a different location with plenty of available disk space.


For any comment or question, join the Facebook group about the Semi-Automatic Classification Plugin.
Categories: OSGeo Planet

Jackie Ng: MapGuide 4.0 Showcase: Transform all the things!

OSGeo Planet - Thu, 2019-09-26 09:28
Did you know that MapGuide uses CS-Map, a coordinate system transformation library with support for many thousands of coordinate systems out of the box? No matter how esoteric the map projection your data is in, MapGuide can re-project geospatial data in and out of it thanks to this library.

So it is quite a shame that, for the longest time, MapGuide's powerful coordinate system transformation capabilities had zero representation in any part of the mapagent (the HTTP API provided by MapGuide).

To use MgCoordinateSystem and friends (the classes that wrap the CS-Map library), you have to build your own MapGuide application using the MapGuide Web API in one of our supported languages (.net, Java or PHP) and have your client application call into it. There is nothing out of the box in the mapagent for a client map viewer application to make a basic HTTP request for transforming coordinates, or for requesting feature data transformed to a certain coordinate system.

For MapGuide 4.0, we've exposed the coordinate system transformation capabilities in APIs where it makes sense, such as our SELECTFEATURES APIs for returning feature data. In my previous showcase post, I showed how appending VERSION=4.0.0 and CLEAN=1 now gives you an intuitive GeoJSON result for this API.



The coordinates are based on the coordinate system of the data source they come from. In the above screenshot, this is latitude/longitude (code: LL84, EPSG: 4326).

Suppose you want this data in web mercator (code: WGS84.PseudoMercator, EPSG: 3857) so you can easily plonk this GeoJSON onto a slippy map with OpenStreetMap or your own XYZ tiles, which would also be in web mercator. With MapGuide 4.0, you can now also include a TRANSFORMTO=WGS84.PseudoMercator parameter in your HTTP request, and that GeoJSON data will be transformed to web mercator.
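
To illustrate the request shape, here is a hedged sketch of a SELECTFEATURES URL using TRANSFORMTO. The host, feature source and class name are hypothetical, and FORMAT=application/json is an assumption on my part; VERSION, CLEAN and TRANSFORMTO are the parameters described in the post.

```shell
#!/bin/sh
# Build a hypothetical mapagent SELECTFEATURES request that asks for the
# result transformed to web mercator.
MAPAGENT="http://localhost/mapguide/mapagent/mapagent.fcgi"
QUERY="OPERATION=SELECTFEATURES&VERSION=4.0.0&CLEAN=1&FORMAT=application/json"
QUERY="$QUERY&RESOURCEID=Library://Samples/Sheboygan/Data/Parcels.FeatureSource"
QUERY="$QUERY&CLASSNAME=Parcels&TRANSFORMTO=WGS84.PseudoMercator"
URL="$MAPAGENT?$QUERY"
echo "$URL"
# Fetch it with curl, adding session or username/password credentials as
# required by your server's security settings.
```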


The other major capability gap was that the mapagent offered no API for transforming a series of coordinates from one coordinate system to another. For MapGuide 4.0, there's now a dedicated API for doing this: the CS.TRANSFORMCOORDINATES operation.

With this operation, simply feed it:

  • A comma-separated list of space-separated coordinate pairs (COORDINATES)
  • The CS-Map code of the source coordinate system the above coordinates are in (SOURCE)
  • The CS-Map code of the target coordinate system to transform to (TARGET)
  • Your desired output format of JSON or XML (FORMAT)
  • If you want JSON, whether you want the pretty JSON version (CLEAN)
Invoking the operation will give you back a collection of transformed coordinates.
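
A hedged sketch of such a request, built from the parameters listed above: the host is hypothetical, and note that in a real request the spaces inside COORDINATES must be URL-encoded as %20.

```shell
#!/bin/sh
# Build a hypothetical CS.TRANSFORMCOORDINATES request: transform two
# lon/lat pairs (LL84) to web mercator (WGS84.PseudoMercator).
MAPAGENT="http://localhost/mapguide/mapagent/mapagent.fcgi"
COORDS="-87.73 43.75,-87.70 43.80"   # comma-separated "x y" pairs
URL="$MAPAGENT?OPERATION=CS.TRANSFORMCOORDINATES&SOURCE=LL84&TARGET=WGS84.PseudoMercator&FORMAT=application/json&CLEAN=1&COORDINATES=$COORDS"
echo "$URL"
```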


A place where this new API is sorely needed is MapGuide Maestro, which previously required kludgy workarounds for transforming bounding boxes in Map Definitions (when adding layers in different coordinate systems, or re-computing extents). The next milestone of MapGuide Maestro will take advantage of this new API for transformation when connected to a MapGuide 4.0 or newer server.
Categories: OSGeo Planet