Survey123 Offline


Survey123 for ArcGIS is perhaps one of those applications that superficial nerds like me would like: it's easy to configure, offers a kiddie-level degree of customization with 'coding' (for that fragile ego-stroke) and comes with user-friendly templates.

No app development or coding experience is required to publish a survey form and, believe it or not, you can personalize your survey so it doesn't look so meh.

It took me some time to stumble through the procedure of enabling this feature before I understood the ArcGIS Online ecosystem to which this app is chained.

So how do we do it? And why doesn't it work pronto?

This issue may be due to the fact that when we first start creating our forms, we go through the generic step-by-step procedures without much thought about what is actually happening. Most of the time, we're too eager to find out how it really works.

When we publish a Survey123 form, be it from the Survey123 website or from Survey123 Connect for ArcGIS, we are actually creating and publishing a folder that contains a hosted feature layer and a form. It is on that hosted feature layer that we add, delete, update or edit data. From ArcGIS Online, it looks like any feature service we publish out of ArcGIS Desktop or ArcGIS Pro, save for the special folder it is placed in alongside a 'Form' file.
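If you want to peek at that pairing without clicking around, the ArcGIS API for Python can list what sits inside the survey folder. This is just a rough sketch, assuming placeholder credentials and a placeholder folder name (Survey123 typically names the folder 'Survey-<your survey title>'):

    from arcgis.gis import GIS

    # Sign in to your ArcGIS Online organization (credentials are placeholders)
    gis = GIS("https://www.arcgis.com", "your_username", "your_password")

    # Hypothetical folder created when the survey was published
    for item in gis.users.me.items(folder="Survey-Tree Inventory"):
        print(item.type, "-", item.title)

    # Expect a 'Form' item and a 'Feature Service' item (the hosted feature layer)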

To enable any offline function on a hosted feature layer in ArcGIS Online, you will need to enable the 'Sync' capability. Many of the technical articles I have gone through to learn how to enable this offline feature keep circling back to 'Prepare basemaps for offline use', which is a tad frustrating. But my experience dealing with Collector for ArcGIS gave me a sense of epiphany when it comes to Survey123. So if you have prepared your Survey123 form for offline usage and it still doesn't work...do not be alarmed, and let's see how to rectify the issue.

1. Locate your survey's hosted feature layer

On your ArcGIS Online home page, click 'Content' on the main tab. We're going to go directly to the hosted feature layer that was generated when you published your survey.

Locate your survey folder and click it open.

In the survey folder, navigate to the survey's hosted feature layer and click the 'Options' button (the ellipsis icon).

At the dropdown, click 'View item details'. Please refer to the screenshot below:

[Screenshot: the 'View item details' option in the survey folder]

2. Change the hosted feature layer settings

At the item details page, navigate to the 'Settings' button at the main header and click it. This will open the settings page for the feature layer. Refer to the screenshot below:

At the 'Settings' page, there are two tabs at the subheader; 'General' and 'Feature layer (hosted)'. Click 'Feature layer (hosted)' to configure its settings.

At the 'Feature layer (hosted)' option, locate the 'Editing' section. Here, check the 'Enable sync' option. This is the option that will enable offline data editing. Please refer to the following screenshot: 

Don't forget to click 'Save'.

[Screenshots: the 'Settings' page and the 'Enable Sync' option under 'Editing']

With this, your hosted feature layer, which serves as the data model, is enabled for synchronization. Synchronization pushes back any changes you've made while you're out in the field collecting data: editing, adding, deleting or updating, depending on what feature editing you've configured.
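And if you'd rather script it than click through the settings page, the same 'Sync' capability can be switched on with the ArcGIS API for Python. This is a minimal sketch, assuming you know the item ID of your survey's hosted feature layer (the ID and credentials below are placeholders):

    from arcgis.gis import GIS
    from arcgis.features import FeatureLayerCollection

    gis = GIS("https://www.arcgis.com", "your_username", "your_password")

    # Placeholder item ID of the survey's hosted feature layer
    item = gis.content.get("0123456789abcdef0123456789abcdef")
    flc = FeatureLayerCollection.fromitem(item)

    # Append 'Sync' to whatever capabilities the service already has
    capabilities = flc.properties.capabilities
    if "Sync" not in capabilities:
        flc.manager.update_definition({"capabilities": capabilities + ",Sync"})
    # Reload the item details page afterwards to confirm 'Enable Sync' is now checked

Either way, the result is the same as ticking the checkbox above.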

It's pretty easy once you get the hang of it; just bear in mind that the data hierarchy in the ArcGIS Online universe is as follows:

Feature layer (hosted) > Web map > Web application

Once you get that out of the way, go crazy with your data collection without any worries!

More Posts from Azaleakamellia and Others

2 years ago

In ArcGIS Pro, the Erase tool only comes with the Advanced license. There are other ways to go about removing parts of a polygon/line data layer, like the Clip tool, but Union is the tool that makes the most sense by principle.

It works by marking the overlapping parts of two different data layers with FID integers: 1, 2 and so forth. Parts that do not overlap are universally -1. So, remove everything you want out of the picture by deleting the output features whose FID value for the erase layer is greater than -1! Simple, eh?
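For anyone who'd rather run it in Python than click through the tools, here's roughly how that looks with arcpy. Everything here is a made-up example: the geodatabase path, the feature class names, and therefore the FID_* field name, which follows whatever your 'erase' layer is called:

    import arcpy

    arcpy.env.workspace = r"C:\data\demo.gdb"  # placeholder geodatabase

    # 'parcels' is the layer we want to keep; 'flood_zone' is the area to erase from it
    arcpy.analysis.Union(["parcels", "flood_zone"], "parcels_union")

    # Pieces that fall inside the erase layer carry its FID (> -1) in FID_flood_zone;
    # deleting them leaves the same result as the Advanced-license Erase tool
    arcpy.management.MakeFeatureLayer("parcels_union", "union_lyr")
    arcpy.management.SelectLayerByAttribute("union_lyr", "NEW_SELECTION", "FID_flood_zone > -1")
    arcpy.management.DeleteFeatures("union_lyr")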

Check out the <3 minutes demo below!

P/S: Happy New Year peeps! ♥


1 year ago
🌱 Google Earth Engine 101


Uploading a shapefile as an asset in GEE and making use of it

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🤚🏻 Requires registration, so sign up 👉🏻 https://signup.earthengine.google.com/; access via browser and an Internet connection.

🖥️ Available for Windows, Mac and Linux.

Google Earth Engine, lovingly called GEE, is another free and open platform provided by Google, offering a vast and comprehensive collection of earth observation data. Since Sentinel-2 is no longer available for download at USGS Earth Explorer and I find the alternatives too challenging, GEE seems like the easiest way to go. If you're looking for a one-stop platform to access satellite imagery for free, GEE is a great place to start. You don't have to learn JavaScript explicitly to start using this tool.
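Just to give a taste of the 'making use of it' part: once your shapefile finishes uploading as an asset, you can pull it back in by its asset path. A small sketch using the Earth Engine Python API (the Code Editor in the video uses JavaScript, but the calls mirror each other; the asset path is a placeholder):

    import ee

    ee.Initialize()  # assumes you've already run `earthengine authenticate`

    # Placeholder asset path; yours shows up under Assets once the shapefile upload finishes
    aoi = ee.FeatureCollection("users/your_username/study_area")

    print("Features in the uploaded asset:", aoi.size().getInfo())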


1 year ago

Malaysia Bid Round 2023 (MBR 2023)


Tool: ArcGIS Pro 2.6.3 Technique: Symbolization, labeling and SQL expression

MBR 2023 is a peak event that culminates all the effort of data collection and stock-taking of hydrocarbon resources in Malaysia. It is an annual event that puts together all the exploration blocks, discoverable hydrocarbon fields and late-life assets for the upstream sector to evaluate and invest in.


Leading up to the event, Malaysia Petroleum Management (MPM) updates, re-evaluates and produces maps, both static and digital, to cater to the need for the most up-to-date stock-take of information gained from various sources: exploration outputs such as seismic and full tensor gradiometry; assets such as cables, pipelines and platforms; as well as discoverable resources. This year's theme aims to include various prospects and initiatives to align the industry with lower carbon emissions and to explore the option of carbon capture and storage (CCS) in popular basins such as the Malay and Penyu Basins. This is a big follow-up to the closing of MBR 2022, with the PSC signing for 9 blocks a few days earlier.


Credit: Sh Shahira Wafa Syed Khairulmunir Wafa

Around 70 maps for unique blocks were produced during the finalization stage, around 210 maps during data evaluation and an additional 20 for the event itself. And this excludes the standardized maps produced to formalize information requested by prospective bidders as well as clients facing the prospect of extending their contracts.

Standardizing the maps required optimizing the workflow and building standard templates to cater to rapid changes and quick exports.

For more information on the event, please access the following resources:

PETRONAS: Malaysia Bid Round

PETRONAS myPROdata

The Malaysian Reserve: Petronas offers 10 exploration blocks in MBR 2023


4 years ago

DIY Geo App

Esri has been releasing more and more MOOCs over the span of 2 years to accommodate its increasingly large expanse of products within the ArcGIS ecosystem.

But of all the MOOCs that I've participated in, 'Do-It-Yourself Geo App MOOC' must be the most underrated one produced by Esri Training. The functionalities highlighted within the MOOC take their anthem right off the recent Esri UC 2020 that went virtual. The curriculum includes:

The creation of hosted feature layer (without utilizing any GIS software medium like ArcMap or ArcGIS Pro).

The basics of the ArcGIS Online platform ecosystem:

hosted feature layer >  web map > web app 

Basically, to view a hosted feature layer, you will need to drag it onto a 'Map' and save it as a web map.

Conventionally, a web map suffices for the visualization and analytical work of any geospatialist who is familiar with Web GIS.

But this time, Esri is highlighting a brand new web map product called 'Map Viewer Beta'. Why beta? Because it is still in beta, but it's so sleekly cool that they just had to let everyone have a shot at using it. Truth be told, Map Viewer Beta did not disappoint.

Even so, Map Viewer Beta still has some functionalities that have yet to be implemented. 

Using web map to visualize data, configure pop-up, execute simple analysis and extending it to Map Viewer Beta interface 

Utilizing Survey123 for crowdsourcing data; the first level of citizen science and creating a webmap out of it.

Creating native apps using AppStudio for ArcGIS; no coding required. 

Some tidbits on accessing the ArcGIS API for JavaScript

I love how cool it is that this MOOC actually shows you, step by step, how to use the new Map Viewer Beta and explains the hierarchy of formats for the published content in the ArcGIS Online platform.
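That hierarchy is also scriptable, which is what finally made it click for me. A tiny sketch with the ArcGIS API for Python going from a hosted feature layer to a web map (the item ID, titles and tags are placeholders, not anything from the MOOC itself):

    from arcgis.gis import GIS
    from arcgis.mapping import WebMap

    gis = GIS("https://www.arcgis.com", "your_username", "your_password")

    # hosted feature layer > web map
    layer_item = gis.content.get("<hosted-feature-layer-item-id>")  # placeholder ID
    wm = WebMap()
    wm.add_layer(layer_item)
    webmap_item = wm.save({
        "title": "My first scripted web map",
        "snippet": "Web map built on top of a hosted feature layer",
        "tags": "demo",
    })
    print(webmap_item.homepage)  # the web map is now an item you can open or feed into a web app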

I established my understanding of the ArcGIS Online ecosystem 3 years back, but I do find it awkward that such powerful information is not actually summarized in a way that is comprehensible for users who have every intention of delving into Web GIS. And Web GIS is the future, with all the parallel servers that can handle the processing and analysis of large amounts of data. ArcGIS Online is a simplified platform that provides interfaces for fresh-eyed new geospatial professionals.

It is quite well known that there has been some criticism of Esri's domination of GIS tools and resources within the geospatial science industry, but I believe it is something we can take with a pinch of salt. Not everything in Esri's massive line of commercial products is superior to other platforms, but it is a starting point for any new geospatialist who wants to explore technologies they are not familiar with.

All in all, this MOOC is heaven-sent. I have been playing with web apps and web maps for close to 4 years and I can attest that it covers all the basics. For the developer's bit, maybe not so much a distinct step-by-step, but it does stoke curiosity as to how it works. The question is, how do we make it work? Now that's a mystery I am eager to solve.

I'm going to put this on my ever-expanding to-do list and think JavaScript for another few months of testing out this ArcGIS API for JavaScript implementation. Tell me if you want to know how this actually works and I'll share what I find out when I do.

For those who missed out on this cohort, fear not. This MOOC runs twice a year and the next cohort runs from Feb 17 to March 17, 2021. Registration is already open, so don't hold back and click the link below:

Do-It-Yourself Geo Apps

Do register for a public account before signing up, or just click 'Register' on the MOOC's page and it will give you the option to either sign in or 'Create a public account'. It was a blast and I'm sure, if you've never used any of the features I've mentioned above, you'll be as wide-eyed as I was 3 years ago. :D

Till then, stay spatially mappy comrades!

P/S: If you complete all the assignments and quizzes, you'll get a certificate of completion from Esri. Which is pretty rad!


4 years ago


Code & Run

Coding is one of the things I have aspired to do since, like...forever! But finding a resource that is in sync with my comprehension and schedule, and that can retain my interest long enough, is a challenge.

I have the attention span of a gnat, so I jumped everywhere! If I am not actively engaged with the learning, I just can't do it. And I know...we have DataCamp, Udemy, Khan Academy and even Kaggle...but I either can't keep up, am too poor to pay for the full course, or it just doesn't click. I believe I can say that most of the exercises don't 'vibe' with me.

Recently, I committed myself to my one passion: running. It was one of my favorite activities back in school, but the will to really run died a decade ago. I have recently picked up my running shoes and run my little heart out despite having the speed of a running ant; aging, perhaps? And I owe my hardcore will to the motivation of earning back what I paid when I decided to join a 1-month-long virtual run of 65 km. It is called the 'Pave Your Path' virtual run, organized by

Running Station

. Nailed it 2 days ago after 13 sessions of 5km - yes, you can accumulate the distance from multiple runs. It made me realize that...it's not that bad. The 'near-death' experience while running kinda turned me into a daredevil these days when it comes to undertaking some things I'd whine about doing a few months back.

"If I can go through dying every single evening for 5km long run...I can handle this,"

My thoughts exactly every time I feel so reluctant to finish some tasks I believe I could hold off for some time.

Naturally, I plan my work rigorously, and despite the flexibility of my schedule and my detailed plans, I still have a hard time hammering the last nail into my projects. Usually, it's due to my brain's exhaustion from overthinking, or I am just truly tired physically. Which is a weird situation given I do not farm for a living. Even so, I was lethargic all the time.

But when I started running a month ago, things kind of fell into place for me. Maybe...just maybe...I've become more alert than I used to be. I still have my ignorance of things that I believe do not concern my immediate attention, but I seem to be able to network my thoughts faster than I used to.

It might be just me, feeling like a new person due to my sheer willpower to not burn my RM60 paid for the virtual run, but it did feel like there was a change.

For that, I managed to confirm what I have suspected all along - I am one of those people who love drills. I like things to be drilled into my head until I by-heart it into efficiency and then focus on polishing the effectiveness.

Thus...for coding, I committed myself to

freeCodeCamp

. By hook or by crook, I'll be coding by first quarter next year or someone's head is gonna roll!

It's an interactive learning experience simple enough for me to start, straightforward enough to not make me waste my time searching for answers and it's free. God bless Quincy Larson.

Going back to the program outlined in freeCodeCamp, I find it fascinating that they start off with HTML. I have no arguments there. My impatience made me learn my lesson - if you run too fast, you're going to burn out painfully and drop dead before you're halfway through. HTML is a very gentle introduction to coding for newbies since it's like LEGO building blocks, where you arrange blocks and match them to create something. I didn't have to go crazy with frustration if I didn't 'get' it. Yes, we would all want some Python lovin', and I think a lot of coders I came to know have raved about how simple it is to learn. But I think it is an opinion shared by 'experienced' coders who wish Python had been there when they first started coding. Someone once told me that what you think is the best based on others' experiences may not be the best for you...and I agree with this. After a lot of deliberation and patience at my end, starting over again this time doesn't feel like the dreaded looming doom I've always had back then.

Are you into coding? What do you code and what's your language preference? Where did you learn coding? Feel free to share with me!


3 years ago

Python: Geospatial Environment Setup (Part 2)


Hey again folks! I am here for the second part of the Python environment setup for a geospatial workspace. I published the first part of this post two weeks ago, so if you've not yet read that, I'll catch you up to speed with our checklist:

Install Python ☑

Install Miniconda ☑

Install the basic Python libraries ☑

Create a new environment for your workspace

Install geospatial Python libraries

🗃 Create a new environment for your workspace

Since we have actually manually set up our base environment quite thoroughly with all the basic libraries needed, to make our work easier, we can just clone the base environment and install all the additional essential libraries needed for geospatial analysis. This new environment will be called geopy. Feel free to use a name you identify most with.

Why don't we just create a new environment? Well, it means we have to start installing the Python libraries again from scratch. Although it is no trouble to do so, we want to avoid installing so many libraries all at once. As I mentioned in Part 1, there is always a risk where incomplete dependencies in one library will affect the installation of other libraries that you intend to install in one go. Since we already have a stable and usable base environment, we can proceed to use it as a sort of pre-made skeleton that we will build our geospatial workspace with.

1️⃣ At the Anaconda Command Prompt, type the following:

conda create --name geopy --clone base

2️⃣ Press Enter and the environment will be cloned for you. Once it is done, you can use the following command to check the availability of your environment 👇🏻

conda env list

You should be able to see your geopy environment listed along with the base environment.

👩🏻‍💻 Install geospatial Python libraries

Here we will proceed with the installation of a few geospatial Python libraries that are essential to reading and exploring the vectors and rasters.

🔺 fiona: This library is the core that some of the more updated libraries depend on. It is a simple and straightforward library that reads and writes spatial data in the common Python IOs without relying on the infamous GDAL's OGR classes.

🔺 shapely: shapely library features the capability to manipulate and edit spatial vector data in the planar geometric plane. It is one of the core libraries that recent geospatial Python libraries rely on to enable the reading and editing of vector data.

🔺 pyproj: is the Python interface for the cartographic projections and coordinate system libraries. Another main library that enables the 'location' characteristics in your spatial data to be read.

🔺 rasterio: reads and writes raster formats and provides a Python API based on Numpy N-dimensional arrays and GeoJSON.

🔺 geopandas: extends the pandas library to allow spatial operations on the geometric spatial data i.e shapefiles.

💀 As you might have noticed, we won't be doing any direct gdal library installation. That's mainly due to the fact that its installation is a process that seems to be accompanied by misery at every turn and involves workarounds that are pretty inconsistent between individuals. Does it mean that we won't be using it for our Pythonic geospatial analysis? Heck no. But we will be taking advantage of the automatic dependency installation that comes with all the libraries above. The rasterio library depends on gdal, and by installing it, we integrate the gdal library indirectly into our geospatial environment. I found this method to be the most fool-proof. Let's proceed to the installation of these libraries.

1️⃣ At the Anaconda Command Prompt, should you start from the beginning, ensure that your geopy environment is activated. If not, proceed to use the following command to activate geopy.

conda activate geopy

Once activated, we can install the libraries mentioned one after another. Nevertheless, you also have the option of installing them in one go with a single command along these lines 👇🏻

conda install -c conda-forge fiona shapely pyproj rasterio

💀 geopandas is not included in this line-up NOT because we do not need it. It's another temperamental library that I prefer to isolate and install individually. If gdal is a rabid dog...then geopandas is a feral cat. You never know how-when-why it doesn't like you, and it can drag a single 10-minute installation into hours.

3️⃣ Once you're done with installing the first line-up above, proceed with our feral cat below 👇🏻

conda install -c conda-forge geopandas

4️⃣ Use the conda list command again to check if all the libraries have been installed successfully.

🎉Et voilá! Tahniah! You did it!🎉
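If you want one more sanity check beyond conda list, fire up Python (or a notebook) inside the geopy environment and import everything in one go. The commented-out shapefile path is just an example; swap in any file you actually have:

    import fiona
    import shapely
    import pyproj
    import rasterio
    import geopandas as gpd

    print(fiona.__version__, shapely.__version__, pyproj.__version__,
          rasterio.__version__, gpd.__version__)

    # Optional: prove the stack reads spatial data end to end
    # gdf = gpd.read_file("some_boundary.shp")  # placeholder path
    # print(gdf.crs, len(gdf))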

🎯 The Jupyter Notebook

It should be the end of the road for the helluva task of creating the geospatial environment. But you're going to ask how to start using it anyway. To access these libraries and start analyzing, we can easily use the simple and straightforward Jupyter Notebook. There are so many IDE choices out there, but for data analysis, Jupyter Notebook has sufficed for me so far, and if you are not familiar with Markdown, this tool will ease you into it slowly.

Jupyter Notebook can be installed in your geopy environment as follows:

conda install jupyter

And proceed to use it by launching it from the command prompt:

jupyter notebook

It ain't that bad, right? If you're still having problems with the steps, do check out the real-time video I created to demonstrate the installation. And feel free to share with us what sort of problems you have encountered and the workaround or solutions you implemented! It's almost never a straight line with this, trust me. As mentioned in the previous post, check out the quick demo below 👇🏻

[Embedded YouTube video: real-time installation demo]

See you guys again for another session on geospatial Python soon!


4 years ago


Wildlife Study Design & Analysis


So, this new year, I've decided to take it down a notch and systematically choose my battlefield. Wildlife species data has always been a mystery to me. As we all know, biologists hold it close to their hearts to the point of annoyance sometimes (those movies with scientists blindly running after some rare orchid or snake or something like that really weren't kidding). Hey...I get it and I totally agree - the data that belongs to the organization has to be treated with utmost confidentiality, and by the experts that collect it, especially since we all know it is not so easily retrieved. Even more so, I optimistically hope the enthusiasm extends to their data cleaning and storage too while they're at it. But it doesn't mean I have to like the repercussions. Especially not when someone expects a habitat suitability map from me and I have no data to work with, and all I have is a ping-pong game of exchanging jargon in the air with the hope that the other player gets what I mean and coughs up something I can work with. Yes...there is not a shred of shame here when I talk about how things work in the world, but it is what it is and I'm not mad. It's just how it works in the challenging world of academics and research.

To cater for my lack of knowledge in biological data sampling and analysis, I actually signed up for the 'Wildlife Study Design and Data Analysis' organized by

Biodiversity Conservation Society Sarawak (BCSS for short)

or

Pertubuhan Biodiversiti Konservasi Sarawak

It just ended yesterday and I can't say I did not cry internally - from pain and gratitude and accomplishment of a sort. 10 days of driving back and forth between the city center and UNIMAS were worth the traffic shenanigans.

It is one of those workshops where you really do get down to the nitty-gritty of understanding probability distributions from scratch: how to use them for your wildlife study sampling design and how to analyze the data to obtain species abundance, occupancy or survival. And most importantly, what Bayes has got to do with it. I've been seeing Bayesian stats, methods and networks on almost anything that involves data science, R and spatial stats, and I was quite miffed that I did not understand a thing. I am happy to report that now, I do. Suffice to say that it was a bootcamp well-deserving of its 'limited seats' reputation, and the certificate really does feel like receiving a degree. It boils down to me realizing a few things I did not know:

I did not know that we have been comparing probabilities instead of generating a 'combined' one based on a previous study all these years.

I did not know that Ronald Fisher had such strong influence that he could ban the usage of Bayesian inference by deeming it unscientific.

I did not know that, for Fisher, if the observation cannot be repeated many times and is uncertain, then the probability cannot be determined - which is crazy! You can't expect to inject a virus into people many times and see whether they die just to generate the probability that it is deadly!

I did not know that Bayes' theorem actually combines the prior probability and the likelihood of the data you collected in the field for your current study to generate the posterior probability distribution (see the formula right after this list)!

I did not know that Thomas Bayes was a pastor and that his theorem was so strongly opposed during his time. It was only after Ronald Fisher died that Bayesian inference gained favor, especially in the medical field.

I did not know...well...almost anything at all about statistics!
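Since the Bayes point above is the one that finally made things click for me, here it is written out, in my own words rather than the workshop's notation:

    P(theta | data) = P(data | theta) × P(theta) / P(data)

where P(theta) is the prior (what previous studies suggest about the parameter), P(data | theta) is the likelihood of the data you collected in the field, and P(theta | data) is the posterior distribution that combines the two. In short: posterior ∝ likelihood × prior.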

It changed the way I look at statistics, basically. I taught myself statistics for close to 9 years and of course I got it wrong most of the time; now I realize that for the umpteenth time, and for that, I hope the statistics powers that be forgive me. This boot camp was so effective, I believe, because of the effort put into developing and executing activities that demonstrate the probability distribution models we were observing. In fact, I wrote down the activities next to each topic just to remember what the deal was. Some of the stuff covered includes the basics of the Binomial Distribution, Poisson Distribution, Normal/Gaussian Distribution, posterior probability, Maximum Likelihood Estimation (MLE), AIC, BACI, SECR, occupancy and survival probability. Yes...exhausting, and I have to say, it wasn't easy. I could be listening, get distracted by a falling piece of paper for a fraction of a moment, and find myself lost in the barrage of information. What saved me was the fact that we had quizzes to fill in to evaluate our understanding of the topic for the day, which we discussed first thing in the next session. Best of all, we were using R with the following packages: wiqid, unmarked, rjags and raster. The best locations for camera trap installation were discussed as well, and all possible circumstances of your data - from collection in the field to management - were covered rigorously.

For any of you out there who are doing wildlife studies, I believe this boot camp contains quintessential information to help you design your study better. Because once the data is produced, all we can do is dance around finding justifications for common pitfalls that we could have countered quite easily.

In conclusion, not only did this workshop cast data analysis in a new light for me, it also helped establish the correct steps and enunciate the requirements to get the most out of your data. And in my case, it has not only let me understand what could be going on with my pals who go out into the jungle to observe wildlife first-hand, it has also given me ideas on looking for resources that implement Bayesian statistics and methods in remote sensing and GI in general. Even though location analysis was not discussed beyond placing the locations of observations and occasions on the map, I am optimistic about further expanding what I understood into some of the stuff I'm planning: habitat suitability modeling and how to not start image classification from scratch...every single time, if that's even possible.

For more information on more workshops by BCSS or wildlife study design and the tools involved, check out the links below:

Biodiversity Conservation Society Sarawak (BCSS) homepage: https://bcss.org.my/index.htm

BCSS statistical tutorials: https://bcss.org.my/tut/

Mike Meredith's home page: http://mikemeredith.net/

And do check out some of these cool websites that I have referred to for more information as well as practice. Just to keep those brain muscles in loop with these 'new' concepts:

Statistical Rethinking: A Bayesian Course with Examples in R and Stan: https://github.com/rmcelreath/statrethinking_winter2019

Probability Concepts Explained: Introduction by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-introduction-a7c0316de465 

Probability Concepts Explained: Maximum Likelihood Estimation by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1

Probability Concepts Explained: Bayesian Inference for Parameter Estimation by Jonny Brooks-Bartlett 

I'll be posting some of the things I am working on while utilizing the Bayesian stats. I'd love to see yours too!

P/S: Some people prefer to use base R with its simple interface, but if you're the type who works better with everything within your focal-view, I suggest you install RStudio. It's an IDE for R that helps to ease the 'anxiety' of using base R. 

P/S/S: Oh! Oh! This is the most important part of all. If you're using ArcGIS Pro like I do, did you know that it has an R-Bridge that enables access to the R workspace in ArcGIS Pro? Super cool, right?! If you want to know more about how to do that, check out this short 2-hour course on how to get the extension in and an example of how to use it:

Using the R-Bridge: https://www.esri.com/training/catalog/58b5e417b89b7e000d8bfe45/using-the-r-arcgis-bridge/


1 year ago
🌱 Google Earth Engine 101


Viewing, filtering and downloading Landsat 9 satellite image

🟢 Beginner-friendly.

🆓 Free with no hidden monetary cost.

🤚🏻 Requires registration, so sign up 👉🏻 https://signup.earthengine.google.com/; access via browser and an Internet connection.

🖥️ Available for Windows, Mac and Linux.

If you're down the rabbit hole with the how-to documents on #googleearthengine and still want to find a simple basic code to start with, feel free to check out this 16-minute video I've made for myself to remember some of the syntaxes. There's no better way to understand it than doing it yourself so, I've also included links to the script app and code in GitHub 👇🏻

⌨️ Code: https://github.com/hrllscrt/gee/tree/...

🌏 GEE sandbox: https://code.earthengine.google.com/a...

The script demonstrates how to utilize readily available data in the Earth Engine data repository, like LSIB 2017 for demarcation, and I've taken the liberty of playing around with Landsat 9. The code should be applicable to Landsat 8 too, for those who want to use it to extract data straight away.
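If you'd like to skim the gist before watching, here's the general shape of that workflow sketched with the Earth Engine Python API instead (the video and the linked script use the JavaScript Code Editor, and the country, dates and cloud threshold below are made-up examples):

    import ee

    ee.Initialize()

    # Country boundary from the LSIB 2017 dataset mentioned above
    malaysia = (ee.FeatureCollection("USDOS/LSIB_SIMPLE/2017")
                .filter(ee.Filter.eq("country_na", "Malaysia")))

    # Landsat 9 Collection 2 Level 2 imagery over the boundary, filtered by date and cloud cover
    landsat9 = (ee.ImageCollection("LANDSAT/LC09/C02/T1_L2")
                .filterBounds(malaysia)
                .filterDate("2022-01-01", "2022-12-31")
                .filter(ee.Filter.lt("CLOUD_COVER", 10)))

    print("Scenes found:", landsat9.size().getInfo())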

Don't be shy to share your questions if you have any and have fun dissecting the data!

