Mapshaper

Ok. 

I wanna know why I have never heard of this online tool before. Like, what the hell is wrong with social media? Is something wrong with Twitter or Instagram that they never caught on to mapshaper? Or was it just me and my hazardous ignorance, yet again?

Have you tried this nifty free online tool that literally simplifies crazy complicated shapefile polygons like it's no one's business?!

It started with some last-minute inspiration on how to collate data from 3 different regions, each derived from a different remote sensing technique. The common goal here is to turn all of them into a vector file, namely a shapefile, and start working on the attributes to ease the merging of the different shapefile layers.

Once merged, this shapefile is to be published as a hosted feature layer on the ArcGIS Online platform and incorporated into a web map that serves as reference data to configure/design a dashboard. What is a dashboard? It's basically an app template in ArcGIS Online that summarizes all the important information in your spatial data. It's a fun app to create, no coding skills required. Check out the gallery here for reference:

Operations Dashboard for ArcGIS Gallery

There are two common ways to publish a hosted feature layer to the ArcGIS Online platform.

Method 1: Zip up the shapefile and upload it as your content. This will trigger a prompt asking if you would like to publish it as a hosted feature layer. You click 'Yes', give it a name, and et voila! You have successfully published a hosted feature layer.

Method 2: From ArcGIS Desktop or ArcGIS Pro, you publish it as a feature service (as ArcMap calls it) or a web layer (as its sister ArcGIS Pro calls it). Fill in the details, enable the functions, then hit 'Publish' and it will be on the platform, provided there are no errors or conflicting issues.

So, what was the deal with me and mapshaper? 

🛑 A fair warning here and please read these bullet points very carefully:

I need you to remember... I absolve myself of any responsibility for what happens to your data should you misinterpret the steps I share.

Please always 👏🏻 BACK 👏🏻 UP 👏🏻 YOUR 👏🏻 DATA. Don't even attempt any of the tools or procedures I am sharing without doing so. Please. Cause I am an analyst too, and hearing that someone forgot to save their data or create a backup is enough to make me die a little inside.

For this tool, please export the attribute table of your shapefile first, because this tool will CHANGE YOUR SHAPEFILE ATTRIBUTES.

When I was publishing the vector I had cleaned and feature-engineered via ArcGIS Pro... it took so long that I was literally dying inside. I'm not talking about 20 minutes or an hour. It took more than 12 hours, and it never conjured the 'Successfully published' notification I expected from it.

So at around 5.30 am, I randomly typed 'simplify shapefile online free'. Lo and behold, there was mapshaper.

All I did was zip up my polygon and drag it onto the homepage, which brings you to the option of choosing the actions to execute while the data is being imported into mapshaper:

detect line intersections

snap vertices

The first option detects the intersections of lines within your vector/shapefile, which can help identify topological errors.

The snap vertices option snaps together points with identical or nearly identical coordinates. It does not work with TopoJSON formats, though.
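Conceptually, snapping is like quantizing coordinates so that points within a tolerance collapse into one. Here's a minimal pure-Python sketch of the idea (this is just an illustration, not mapshaper's actual implementation, and the tolerance value is made up):

```python
def snap_vertices(points, tolerance):
    """Collapse points whose coordinates fall into the same
    tolerance-sized grid cell onto a single representative point."""
    snapped = []
    seen = {}  # grid cell -> representative point
    for x, y in points:
        cell = (round(x / tolerance), round(y / tolerance))
        if cell not in seen:
            seen[cell] = (x, y)  # first point in this cell wins
        snapped.append(seen[cell])
    return snapped

# Two vertices 0.0000001 apart collapse into one shared vertex
pts = [(1.0, 2.0), (1.0000001, 2.0), (5.0, 5.0)]
print(snap_vertices(pts, 0.001))  # → [(1.0, 2.0), (1.0, 2.0), (5.0, 5.0)]
```

The payoff is that shared borders between polygons end up using exactly the same vertices, which is what makes clean topology possible downstream.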


There is something interesting about these options too: you can enter other kinds of customized options provided by the tool through its command-line interface! But hold your horses, peeps. I did not explore that, because here we want to fix an issue, and we'll focus on that first. I checked both options and imported the data.


This will bring you to a page where you can start configuring the options and method to simplify your vector.

To simplify your shapefile, there are two options that prevent the shapes of the polygons from being compromised: 'prevent shape removal', which keeps small features from vanishing entirely, and 'use planar geometry', which uses planar Cartesian geometry instead of the usual geodetic longitude and latitude. The implication of the second option is not obvious to me yet; all I wanted was to get the data simplified for easy upload and clean topology, so I chose both options to maintain the shape and visibility of all my features despite the highest degree of simplification.

Like the simplification methods in mainstream software, there are familiar names here:

Douglas-Peucker

Visvalingam / effective area

Visvalingam / weighted area

First and foremost, I had not the slightest idea what these were. Like, for real. I usually go for the default first to understand what sort of output it will bring me, and here the default, Visvalingam / weighted area, seemed like the best option. So what are these simplification methodologies? They are just algorithms used to simplify your vectors:

🎯 Douglas-Peucker algorithm decimates a curve composed of line segments to a similar curve with fewer points (Ramer-Douglas-Peucker algorithm, Wikipedia; 2021).
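For the curious, the idea behind Douglas-Peucker is small enough to sketch in pure Python. This is the textbook recursive version, not mapshaper's actual code: keep the vertex farthest from the chord between the endpoints, and recurse on both halves only if that distance exceeds the tolerance.

```python
def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == dy == 0:  # a and b coincide
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Twice the triangle (a, b, p) area divided by the base length |ab|
    return abs(dx * (ay - py) - (ax - px) * dy) / (dx * dx + dy * dy) ** 0.5

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: keep the point farthest from the
    chord; recurse on both halves only if it exceeds epsilon."""
    if len(points) < 3:
        return points
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= epsilon:  # whole segment is "flat enough"
        return [points[0], points[-1]]
    left = rdp(points[: index + 1], epsilon)
    right = rdp(points[index:], epsilon)
    return left[:-1] + right  # drop the duplicated split point

# A nearly straight line collapses to its two endpoints
line = [(0, 0), (1, 0.05), (2, -0.04), (3, 0.01), (4, 0)]
print(rdp(line, 0.1))  # → [(0, 0), (4, 0)]
```

The `epsilon` tolerance plays the same role as the slider percentage: larger values throw away more vertices.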

🎯 The Visvalingam algorithm is a line simplification operator that eliminates the less significant points of a line based on the 'effective area' concept: the area of the triangle formed by each point with its two immediate neighboring points (Visvalingam Algorithm | aplitop).

🎯 The Visvalingam algorithm with weighted area is a later development of the Visvalingam algorithm in which an alternative, weighted metric is used to take the shape into account (Visvalingam & Whelan, 2016).
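The effective-area idea can also be sketched in a few lines of Python. This naive version recomputes every area on each pass just to keep it readable; real implementations use a priority queue, and the weighted variant scales each area by a shape-dependent weight before comparing:

```python
def triangle_area(a, b, c):
    """Effective area: area of the triangle formed by three
    consecutive vertices of the line."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def visvalingam(points, min_area):
    """Repeatedly drop the interior vertex with the smallest
    effective area until every remaining vertex matters."""
    pts = list(points)
    while len(pts) > 2:
        areas = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        smallest = min(range(len(areas)), key=areas.__getitem__)
        if areas[smallest] >= min_area:
            break
        del pts[smallest + 1]  # +1: areas[] is offset by one vertex
    return pts

# The tiny 0.01-area wiggle disappears; the big spike survives
zigzag = [(0, 0), (1, 0.01), (2, 0), (3, 2), (4, 0)]
print(visvalingam(zigzag, 0.5))  # → [(0, 0), (2, 0), (3, 2), (4, 0)]
```

This is why Visvalingam tends to preserve spikes and bays better than Douglas-Peucker: it ranks vertices by how much area they contribute, not by distance from a chord.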

For reasons I can't even explain, I configured my methodology to use the third option, and now that I have had the time to google it, thank God I did.


Then, see and play with the magic at the 'Settings' slider, where you can adjust and preview the simplification applied to the vector! I adjusted it to 5%. The shape was retained beautifully. And please bear in mind, this vector was converted from a raster, so what I really wanted was a simplified version of the cleaned data, ready to upload.

Now that you've simplified it, export it as a zipped shapefile; once extracted, you can use it like any other shapefile.

Remember when I said you have got to export your attribute table before you use this tool? Yeah... that's the thing. The attribute table will shock you, cause it'll be empty. Literally, with only the OBJECTID left. Now, with the attribute table you backed up, use the 'Join Table' tool in ArcGIS Pro or ArcMap and join the attributes back in without any issues.
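If you keep the backup as a CSV, the rejoin is just a key-based merge on OBJECTID. Here's a pandas sketch of what the 'Join Table' tool is doing under the hood (the column names are made up for illustration):

```python
import pandas as pd

# Attribute table exported BEFORE simplification (your backup)
backup = pd.DataFrame({
    "OBJECTID": [1, 2, 3],
    "region":   ["north", "central", "south"],
    "area_ha":  [120.5, 88.0, 230.1],
})

# What the simplified output's table looks like: keys only
simplified = pd.DataFrame({"OBJECTID": [1, 2, 3]})

# A left join on the shared key restores every attribute
restored = simplified.merge(backup, on="OBJECTID", how="left")
print(restored)
```

A left join keeps every simplified feature even if a key is somehow missing from the backup, which is usually what you want here.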

Phewh!!

I know it has a lot more functions than this, but hey, I'm just getting started. If you've ever done anything more rocket-science than what I did two days ago, please share it with the rest of us. Cause I gotta say, this thing is cray!! Love it so much.

mapshaper developer, if you're seeing this, I 🤟🏻 you!

UPDATE

I have been asked about the confidentiality of the data. I think this is where it helps to understand why the tool works even with just the '.shp' file of the shapefile, since that is the vector portion of the shapefile.

A shapefile is a spatial data format that is actually made up of several files. Each of these files shares the same name with a different extension; the mandatory trio is .shp, .shx and .dbf, usually accompanied by .prj. They are pretty straightforward:

.prj: stores the projection information

.shx: stores the positional index that lets software locate each record inside the .shp file

.dbf: stores the tabulated attributes of each feature in the vector file

.shp: stores the shape/vector information of the shapefile.

So, as the tool indicates, it works on the vector aspect of your data, which is crucial in cartography.
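Since a shapefile travels as a bundle of sidecar files, a quick sanity check before zipping is to confirm they all exist. A tiny pathlib sketch (the file name here is hypothetical):

```python
from pathlib import Path

def shapefile_parts(shp_path):
    """Given a .shp path, report which companion files exist."""
    base = Path(shp_path)
    return {ext: base.with_suffix(ext).exists()
            for ext in (".shp", ".shx", ".dbf", ".prj")}

print(shapefile_parts("data/boundaries.shp"))
```

If .dbf is missing you lose the attributes, and if .prj is missing the software has to guess the projection, so a check like this can save an upload round-trip.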

More Posts from Azaleakamellia and Others

1 year ago
🌱 Google Earth Engine 101


Uploading a shapefile as an asset in GEE and making use of it

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🤚🏻 Requires registration so sign-up 👉🏻https://signup.earthengine.google.com/, access via browser and Internet connection

🖥️ Available for Windows, Mac and Linux.

Google Earth Engine, or lovingly called GEE, is another free and open platform by Google that provides a very vast and comprehensive collection of earth observation data. Since Sentinel-2 is no longer available for download at USGS Earth Explorer and I find the alternatives too challenging, GEE seems like the easiest way to go. If you're looking for a one-stop platform to access satellite imagery for free, GEE is a great place to start. You don't have to learn JavaScript explicitly to start using this tool.


3 years ago

🗻 Diorama Drama

So, John Nelson made this absolutely simple tutorial on how to create a diorama of interesting features on Earth's surface. I have been eyeing a lot of people making them and resisting the urge to make one. I mean, I have the habit of falling into a rabbit hole when making maps, so I'd definitely have too much fun to stop. That could mean hours and hours scrutinizing colors, perspectives, setups and even legend arrangements. But this... I decided not to overdo it and just make one.

I can't believe how easy it is to make one! Here's a piece of many to come; the elevation diorama of area within the Batang Ai Dam and Batang Ai National Park in Sarawak, East Malaysia.


What you'll learn... 1️⃣ How to extract a raster from an existing image layer in Living Atlas; that's how we extracted the DEM layer for this elevation data.

2️⃣ How to create a 3D scene in ArcGIS Pro and navigate through them.

3️⃣ How to create a layout layer and export it as a static image format, e.g. JPEG, PNG, TIF, etc.

Check out the video tutorial here 👇🏻

This tutorial makes me feel like there is a possibility of creating floating islands or pixel-like models of isolated pieces of land; my head is imagining dragons flying around some nonexistent Viking fantasy map. But hey... if you're not inspired to create crazier maps from John's tutorial, then are you even a cartographer at all?

He's on Twitter, LinkedIn and YouTube, and he even has this supercool blog, Adventures in Mapping, where he shares all of his tutorials in full documentation and a gorgeous portfolio that could only consist of magic! One such as I can only aspire.

So, that was what our Sunday looked like and I think I'll be playing around with dioramas for a few weeks. Let's see just how many of these I can make 'til the next Sunday.


2 years ago

[2022] 30 Day Map Challenge -- FAILED


Last year, I participated once again in the 30 Day Map Challenge that goes around Twitter-ville come November. It was my 3rd attempt at the marathon, and 2022 served as a reminder that I made progress too, despite getting stuck at Day 3 as life caught up with me.

I don't like the idea that I have left the challenge incomplete, again. But it was not my priority, and I work better with clear goals or a vision of the expected output. If a task does not add to my need to learn something new... it is bound to head straight to the backburner. Let's resolve to make it a long-term routine instead of a spurt of stress trying to make the deadline.

As a consequence, I am turning this task into one that actually benefits me: putting on record, in writing, the techniques and tools I used to make the maps. I believe that will serve more purpose and add more value than visuals alone. And perhaps I'll have some stock ready for submission this year instead.

Did anyone else participate in this challenge back in November? How did you do, and what would you like to do better for the next one? Don't be shy and do drop a word or two.


1 year ago

🌱 Google Earth Pro 101

Google Earth Pro is one of the most powerful freely available software packages one can use for location investigation. If you're a non-tech GIS user who needs to know just enough to get your work going, then let's hit the ground running with this starter tutorial.

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🖥️ Available for Windows, Mac and Linux.


2 years ago

Uninspired

Kuching City Road Network (Saturday, 10/02/2023)

I am a reckless, uninspired person. I call myself a map-maker, but I don't really get to make maps, because I don't think I should venture outside my requesters' requests. Mostly, I am compelled to get it right, and I feel good if I can deliver what they need. The thing is, I no longer get spontaneously inspired to make maps. Just as the rules become clearer the more you read books on cartography, fears crop up like the plants in 'Plants vs. Zombies' 🌱 on PlayStation.

So, I am scared that my excitement about making maps is wearing off; really making them, not just knowing how to make them.

What sort of idea is great? I mean, what should I focus on trying to make? There is so much data out there that whatever I attempt may be missing the train or just pale in comparison to other incredible work. I don't really mind it, but I'm not so young that I don't understand that self-esteem does ease the thinking process.

Can't say much, I mean... the 30 Day Map Challenge hasn't gone all that well for me. I should've prepared something before the event even started. I quit after the 3rd challenge cause I overthink and get panic attacks every time I feel I'm doing stuff half-assed.

Despite all that, I am lucky to have aggressively supportive siblings. They just can't seem to stop the tough love and are always kicking me to just barf something out.

'It's the process that matters!'

When did I start forgetting how wonderful the process is, huh?


3 years ago

Community Empowerment Strategy Dashboard 2021 | WWF-Malaysia

Community Empowerment Strategy Dashboard (2021)

Tool: Operations Dashboard for ArcGIS, Survey123 for ArcGIS, ArcGIS Online
Technique: XLSForm programming, web application development

The northern highland communities of Lun Bawang have been collaborating with WWF-Malaysia under the Sarawak Conservation Programme (SCP) to empower sustainable economies and manage their natural biodiversity through the Community Empowerment Strategy (formerly known as the Community Engagement and Education Strategy).

Since 2016, the communities have been actively mapping out their land uses and culturally important locations to delineate their areas of settlement and sources of livelihood. Given the close vicinity of their communities to the licensed timber concessions, producing a definitive map is important for preserving and conserving the surrounding natural capital.

Several outreach sessions have been conducted, and the community mapping effort has shifted to citizen science via the Survey123 for ArcGIS mobile application, which is a part of the ArcGIS ecosystem. This enables the local community to collect information despite the lack of network reception; the data can be synchronized automatically when a connection becomes available or shared manually with the field officers.

📌 Availability: Retracted in 2021


3 years ago

Python: Geospatial Environment Setup (Part 2)


Hey again folks! I am here for the second part of the Python environment setup for a geospatial workspace. I published the first part of this post two weeks ago, so if you've not yet read that, I'll catch you up to speed with our checklist:

Install Python ☑

Install Miniconda ☑

Install the basic Python libraries ☑

Create a new environment for your workspace

Install geospatial Python libraries

🗃 Create a new environment for your workspace

Since we already manually set up our base environment quite thoroughly with all the basic libraries needed, we can make our work easier by simply cloning the base environment and installing the additional libraries essential for geospatial analysis. This new environment will be called geopy. Feel free to use a name you identify with most.

Why don't we just create a brand-new environment? Well, that would mean installing the Python libraries again from scratch. Although it is no trouble to do so, we want to avoid installing so many libraries all at once. As I mentioned in Part 1, there is always a risk that incomplete dependencies in one library will affect the installation of other libraries you intend to install in one go. Since we already have a stable and usable base environment, we can use it as a sort of pre-made skeleton on which to build our geospatial workspace.

1️⃣ At the Anaconda Command Prompt, type the following:

conda create --name geopy --clone base

2️⃣ Press Enter and the environment will be cloned for you. Once it is done, you can use the following command to check that your environment is available 👇🏻

conda env list

You should be able to see your geopy environment listed along with the base environment.

👩🏻‍💻 Install geospatial Python libraries

Here we will proceed with the installation of a few geospatial Python libraries that are essential to reading and exploring the vectors and rasters.

🔺 fiona: This library is the core that some of the newer libraries depend on. It is a simple and straightforward library that reads and writes spatial data using common Python IO, without relying on the infamous GDAL OGR classes.

🔺 shapely: shapely library features the capability to manipulate and edit spatial vector data in the planar geometric plane. It is one of the core libraries that recent geospatial Python libraries rely on to enable the reading and editing of vector data.

🔺 pyproj: the Python interface to the cartographic projections and coordinate system libraries. Another core library, it enables the 'location' characteristics of your spatial data to be read.

🔺 rasterio: reads and writes raster formats and provides a Python API based on Numpy N-dimensional arrays and GeoJSON.

🔺 geopandas: extends the pandas library to allow spatial operations on geometric spatial data, e.g. shapefiles.

💀 As you might have noticed, we won't be installing the gdal library directly. That's mainly because its installation is a process that seems to be accompanied by misery at every turn and involves workarounds that are pretty inconsistent between individuals. Does that mean we won't be using it for our Pythonic geospatial analysis? Heck no. We will be taking advantage of the automatic dependency installation that comes with the libraries above: rasterio depends on gdal, so by installing rasterio we integrate gdal into our geospatial environment indirectly. I found this method the most fool-proof. Let's proceed to the installation of these libraries.

1️⃣ At the Anaconda Command Prompt, should you be starting from the beginning, ensure that your geopy environment is activated. If not, use the following command to activate geopy.

conda activate geopy

2️⃣ Once activated, we can install the libraries one after another. Alternatively, you can install them in one go with a single command 👇🏻

conda install fiona shapely pyproj rasterio

💀 geopandas is not included in this line-up NOT because we do not need it. It's another temperamental library that I prefer to isolate and install individually. If gdal is a rabid dog... then geopandas is a feral cat. You never know how, when or why it doesn't like you, dragging a single 10-minute installation into hours.

3️⃣ Once you're done with installing the first line-up above, proceed with our feral cat below 👇🏻

conda install geopandas

4️⃣ Use the conda list command again to check if all the libraries have been installed successfully.

🎉 Et voilà! Tahniah! You did it! 🎉

🎯 The Jupyter Notebook

That should be the end of the road for the helluva task of creating the geospatial environment. But you're going to ask how to start using it anyway. To access these libraries and start analyzing, we can use the simple and straightforward Jupyter Notebook. There are many IDE choices out there, but for data analysis Jupyter Notebook has sufficed for me so far, and if you are not familiar with Markdown, this tool will ease you into it slowly.

Jupyter Notebook can be installed in your geopy environment as follows:

conda install jupyter

Then open it from the command prompt:

jupyter notebook

It ain't that bad, right? If you're still having problems with the steps, do check out the real-time video I created to demonstrate the installation. And feel free to share the problems you encountered and the workarounds or solutions you implemented! It's almost never a straight line with this, trust me. As mentioned in the previous post, check out the quick demo below 👇🏻

🌏 Python: Geospatial Environment Setup 🎉
YouTube
A deconstructed time-lapsed Python geospatial environment setup with `conda` from downloading necessary software to loading the Jupyter Note

See you guys again for another session on geospatial Python soon!


1 year ago
🌱 Google Earth Engine 101


Viewing, filtering and downloading Landsat 9 satellite image

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🤚🏻 Requires registration so sign-up 👉🏻https://signup.earthengine.google.com/, access via browser and Internet connection

🖥️ Available for Windows, Mac and Linux.

If you're down the rabbit hole with the how-to documents on #googleearthengine and still want a simple, basic piece of code to start with, feel free to check out this 16-minute video I made for myself to remember some of the syntax. There's no better way to understand it than doing it yourself, so I've also included links to the script app and the code on GitHub 👇🏻

⌨️ Code: https://github.com/hrllscrt/gee/tree/...

🌏 GEE sandbox: https://code.earthengine.google.com/a...

The script demonstrates how to use readily available data in the Earth Engine data repository, like LSIB 2017 for demarcation, and I've taken the liberty of playing around with Landsat 9. The code should be applicable to Landsat 8 too, for those who want to use it to extract data straight away.

Don't be shy to share your questions if you have any and have fun dissecting the data!


4 years ago


Wildlife Study Design & Analysis


So, this new year, I've decided to take it down a notch and systematically choose my battlefield. Wildlife species data has always been a mystery to me. As we all know, biologists hold it close to their hearts, to the point of annoyance sometimes (those movies with scientists blindly running after some rare orchid or snake really weren't kidding). Hey... I get it and I totally agree: data that belongs to the organization has to be treated with the utmost confidentiality, and by the experts that collect it. Especially since we all know it is not so easily retrieved. Even more so, I optimistically hope that the enthusiasm extends to their data cleaning and storage too, while they're at it. But it doesn't mean I have to like the repercussions. Especially not when someone expects a habitat suitability map from me and I have no data to work with; all I have is a ping-pong game of exchanging jargon in the air, hoping the other player gets what I mean and coughs up something I can work with. Yes... there is not a shred of shame here when I talk about how things work in the world, but it is what it is and I'm not mad. It's just how it works in the challenging world of academics and research.

To cater for my lack of knowledge in biological data sampling and analysis, I signed up for the 'Wildlife Study Design and Data Analysis' workshop organized by the Biodiversity Conservation Society Sarawak (BCSS for short), or Pertubuhan Biodiversiti Konservasi Sarawak.

It just ended yesterday and I can't say I did not cry internally; from pain and gratitude and accomplishment of a sort. 10 days of driving back and forth between the city center and UNIMAS was worth the traffic shenanigans.

It is one of those workshops where you really get down to the nitty-gritty of understanding probability distributions from scratch: how to use them for your wildlife study sampling design, and how to analyze the data to obtain species abundance, occupancy or survival. And most importantly, what Bayes has got to do with it. I've been seeing Bayesian stats, methods and networks on almost anything that involves data science, R and spatial stats, and I was quite miffed that I did not understand a thing. I am happy to report that now I do. Suffice to say that it was a bootcamp well-deserving of its 'limited seats' reputation, and the certificate really does feel like receiving a degree. It boils down to me realizing a few things I did not know:

I did not know that we have been comparing probabilities instead of generating a 'combined' one based on a previous study all these years.

I did not know that Ronald Fisher had such strong influence that he could ban the usage of Bayesian inference by deeming it unscientific.

I did not know that, for Fisher, if an observation cannot be repeated many times and is uncertain, then the probability cannot be determined; which is crazy! You can't shoot a virus into people many times and watch them die to generate the probability that it is deadly!

I did not know that Bayes theorem actually combines prior probability and the likelihood data you collected on the field for your current study to generate the posterior probability distribution!

I did not know that Thomas Bayes was a pastor and that his theory was so opposed during his time. It was only after Ronald Fisher died that Bayesian inference gained favor, especially in the medical field.

I did not know...well...almost anything at all about statistics!
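That "combining" of a prior with the likelihood of your field data is small enough to compute by hand. Here's a toy Bayes-rule sketch in Python with entirely made-up numbers (1% prevalence, 95% sensitivity, 5% false-positive rate), just to show the mechanics:

```python
def posterior(prior, likelihood_pos, likelihood_neg):
    """Bayes theorem: P(H|D) = P(D|H) * P(H) / P(D), where the
    evidence P(D) sums over both hypotheses."""
    evidence = likelihood_pos * prior + likelihood_neg * (1 - prior)
    return likelihood_pos * prior / evidence

# A positive test on a rare condition: the prior belief (1%)
# is updated by the data, not replaced by the test accuracy
p = posterior(prior=0.01, likelihood_pos=0.95, likelihood_neg=0.05)
print(round(p, 3))  # → 0.161
```

Even with a 95%-accurate test, the posterior is only about 16%, because the prior is so low. That is exactly the "combined" probability the workshop was on about, as opposed to comparing raw likelihoods.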

It changed the way I look at statistics, basically. I have self-taught myself statistics for close to 9 years and of course I got it wrong most of the time; now I realize that for the umpteenth time. For that, I hope the statistics powers that be forgive me. This boot camp was so effective, I believe, because of the effort put into developing and executing activities that demonstrate the probability distribution models we were observing. In fact, I wrote down the activities next to each topic just to remember what the deal was. Some of the stuff covered: the basics of the Binomial distribution, Poisson distribution, Normal/Gaussian distribution, posterior probability, Maximum Likelihood Estimation (MLE), AIC, BACI, SECR, and occupancy and survival probability. Yes... exhausting, and I have to say it wasn't easy. I could be listening, get distracted by a falling piece of paper for a fraction of a moment, and find myself lost in the barrage of information. What saved me were the quizzes we had to fill in to evaluate our understanding of the day's topics, which we discussed first thing in the next session. Best of all, we were using R with the following packages: wiqid, unmarked, rjags and raster. The best locations for camera trap installation were discussed as well, and all possible circumstances of your data, from management to collection in the field, were covered rigorously.

For any of you out there doing wildlife studies, I believe this boot camp contains quintessential information for designing your study better. Because once the data is produced, all we can do is dance around finding justifications for common pitfalls that could've been countered quite easily.

In conclusion, not only did this workshop cast data analysis in a new light for me, it also helped establish the correct steps and enunciate the requirements to get the most out of your data. In my case, it has not only let me understand what could be going on with my pals who go out into the jungle to observe wildlife first-hand, it has also given me ideas for finding resources that apply Bayesian statistics/methods to remote sensing and GIS in general. Even though location analysis was not discussed beyond placing observation locations and occasions on a map, I am optimistic about expanding what I understood into some of the things I'm planning: habitat suitability modeling, and how to avoid starting image classification from scratch... every single time, if that's even possible.

For more information on more workshops by BCSS or wildlife study design and the tools involved, check out the links below:

Biodiversity Conservation Society Sarawak (BCSS) homepage: https://bcss.org.my/index.htm

BCSS statistical tutorials: https://bcss.org.my/tut/

Mike Meredith's home page: http://mikemeredith.net/

And do check out some of these cool websites that I have referred to for more information as well as practice. Just to keep those brain muscles in loop with these 'new' concepts:

Statistical Rethinking: A Bayesian Course with Examples in R and Stan: https://github.com/rmcelreath/statrethinking_winter2019

Probability Concepts Explained: Introduction by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-introduction-a7c0316de465 

Probability Concepts Explained: Maximum Likelihood Estimation by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1

Probability Concepts Explained: Bayesian Inference for Parameter Estimation by Jonny Brooks-Bartlett 

I'll be posting some of the things I am working on while utilizing the Bayesian stats. I'd love to see yours too!

P/S: Some people prefer to use base R with its simple interface, but if you're the type who works better with everything within your field of view, I suggest you install RStudio. It's an IDE for R that helps ease the 'anxiety' of using base R.

P/S/S: Oh! Oh! This is the most important part of all. If you're using ArcGIS Pro like I do, did you know that the R-Bridge lets you access the R workspace from ArcGIS Pro? Supercool, right?! If you want to know more about how to do that, check out this short 2-hour course on getting the extension in, with an example of how to use it:

Using the R-Bridge: https://www.esri.com/training/catalog/58b5e417b89b7e000d8bfe45/using-the-r-arcgis-bridge/


azaleakamellia - anecdata

#gischat #eo #running #simblr #cartokantoi
