Web map of Canada's Federal Election from 2000 to 2015

ESRI has created a web map of Canada's federal elections from 2000 to 2015. You can click on the map to view the election results and graphs for any federal electoral district, and change the election year by clicking the arrow at the bottom of the map. The legend's description can be viewed by clicking the info button (i) in the top right corner. It is a useful web map for exploring federal election results across Canada over this period.

Web Map displaying Satellites in Earth's Orbit

ESRI has developed a web map using the ArcGIS API for JavaScript 4.0 that shows satellites in Earth's orbit. You can click on any satellite (dot) to view its description. It is an interactive 3D web mapping application, and it's great fun to play around with.

Mapping Liquid Water on Mars

Following NASA's recent announcement of the discovery of water on Mars, ESRI has created an interactive story map that combines high-resolution satellite imagery, crater views and animated videos to show the evidence of liquid water on Mars.

ESRI's Interactive story Map showing Liquid water on Mars

Virtual trip to Mars using NASA's Mars Trek



NASA has launched Mars Trek, an interactive map viewer that can be used to explore Mars, much like the way you explore Earth on Google Earth! You can take a quick tour to familiarize yourself with the viewer, and you can even view the surface in 3D! If signing up for a one-way trip to Mars sounds scary, count your lucky stars: you can now explore Mars with a few clicks from our very own planet, Earth!

Click on the link below to access NASA's Mars Trek:
NASA's Mars Trek - an interactive map viewer

Python code to export layers in the table of contents of an mxd to PDF



# Name: layerstopdf.py
# Date: January 15, 2014
# Author: Sarbajit Gurung
# Description: Export each layer from Table of contents in ArcMap to a pdf document
# ---------------------------------------------------------------------------

# Import arcpy module
import arcpy

# if the pdfs need dates attached to their names, use this block; otherwise it can be excluded
import datetime
today = datetime.date.today()
yesterday = today - datetime.timedelta(days=1)

# Define mxd
mxd = arcpy.mapping.MapDocument(r"C:\input\test.mxd")

#set export path to export pdfs
path = r"C:\output\Maps\\"

# turn each layer on and off and export as pdf
for lyr in arcpy.mapping.ListLayers(mxd):
    if lyr.name == "LayerA":
        lyr.visible = True
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()
        arcpy.mapping.ExportToPDF(mxd, path + "LayerA_" + yesterday.strftime('%d%b%Y') + ".pdf")
        # turn the layer back off once it has been exported
        lyr.visible = False
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()

    if lyr.name == "LayerB":
        lyr.visible = True
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()
        arcpy.mapping.ExportToPDF(mxd, path + "LayerB_" + yesterday.strftime('%d%b%Y') + ".pdf")
        # turn the layer back off once it has been exported
        lyr.visible = False
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()

# so on and so forth for other layers in your mxd
# you can also store the layer names in a list and loop over them if you have many layers in the TOC (see the sketch below)

# Release mxd
del mxd
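As the comment above suggests, when an mxd has many layers the per-layer blocks can be replaced by a loop over a list of layer names. Here is a minimal sketch of that approach; the layer names and paths below are placeholders and should be replaced with the ones from your own map document.

# layers_loop_sketch.py - loop-based alternative to the per-layer blocks above (illustrative only)
import arcpy
import datetime

yesterday = datetime.date.today() - datetime.timedelta(days=1)

mxd = arcpy.mapping.MapDocument(r"C:\input\test.mxd")
path = r"C:\output\Maps\\"

# placeholder layer names - replace with the names from your own table of contents
layer_names = ["LayerA", "LayerB", "LayerC"]

for lyr in arcpy.mapping.ListLayers(mxd):
    if lyr.name in layer_names:
        lyr.visible = True
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()
        arcpy.mapping.ExportToPDF(mxd, path + lyr.name + "_" + yesterday.strftime('%d%b%Y') + ".pdf")
        lyr.visible = False
        arcpy.RefreshTOC()
        arcpy.RefreshActiveView()

del mxd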

Python code to export an mxd to PDF with a date value in the file name



 

# Name: Exporttopdf.py
# Date: February 11, 2015
# Author: SGurung
# Description: This code creates a pdf map from an mxd file. The pdf file can have a date value
# attached to its name, e.g., "testmap_date.pdf". The code can be scheduled in Task Scheduler
# to run at your desired time.
# ---------------------------------------------------------------------------

# import arcpy module
import arcpy

# import datetime module. Date value can be today, yesterday or as defined by user
import datetime
today = datetime.date.today()
yesterday = today - datetime.timedelta(days=1)

# Overwrite the output file if it already exists
arcpy.env.overwriteOutput = True

# Define the location of mxd
mxd = arcpy.mapping.MapDocument(r"C:\input\test.mxd")

# Define the output location and the date value to be included in the pdf file name
# %d = day of the month, %b = abbreviated month name, %Y = four-digit year
pdf = r"C:\output\Maps\\" + "testmap_" + yesterday.strftime('%d%b%Y') + ".pdf"

# Set pdf parameters as per your requirements here:
data_frame = 'PAGE_LAYOUT'
resolution = 300
image_quality = "NORMAL"
colorspace = "RGB"
compress_vectors = True
image_compression = "DEFLATE"
picture_symbol = 'RASTERIZE_BITMAP'
convert_markers = False
embed_fonts = True
layers_attributes = "NONE"
georef_info = False

# Export mxd to pdf
arcpy.mapping.ExportToPDF(mxd, pdf, data_frame, 640, 480, resolution, image_quality, colorspace, compress_vectors, image_compression, picture_symbol, convert_markers, embed_fonts, layers_attributes, georef_info)

# Release mxd
del mxd

Geoportal about earthquakes and relief efforts in Nepal


ICIMOD (the International Centre for Integrated Mountain Development) has done an amazing job creating a geoportal that provides interactive maps, charts, and infographics about the earthquakes and relief efforts in Nepal. ICIMOD was honored with the ESRI Humanitarian Award at the 2015 ESRI User Conference for its role in assisting the Nepalese government with GIS technology.

The key features of the geoportal are as follows:
  • interactive maps and charts with facts and figures at national, district and municipality levels
  • facts and figures under different thematic areas such as geohazards, infrastructure, education, landslides, agriculture etc.
  • swipe visualization of high-resolution satellite imagery to compare conditions before and after the earthquake
  • story map journals of affected districts providing the status of casualties, damage, and response efforts
  • information on geo-hazards, including field data on landslides with 3D visualization
  • incident reporting for disaster events in near real-time

The geoportal is very responsive and the layers render smoothly. The graphs are clearly laid out. The home page is a collection of different thematic map viewers, such as 'Country profile', 'Landslide assessment', 'Geo-hazard', 'Swipe map' and 'Incident reporting', and the viewers are extremely user friendly. However, the geoportal would have been even better with the addition of basic map viewer tools such as measuring, simple querying, attribute export and printing. Having said that, the geoportal shows how effectively GIS technology can be used for natural disaster management.

Nepal Earthquake 2015 Web Map

Nepal was struck by a devastating earthquake on 25th April 2015, and a major aftershock followed on 12th May 2015. Here is a web map of the earthquake in Nepal that contains layers such as a photo log, earthquake incidents, damage assessment and census data symbolized by total population at the municipality and village development committee (VDC) level. You can click on any point to view its attribute information. Click the arrow at the top left to view the legends of the layers. Zoom in and out using the + and - buttons at the top left; the home icon takes you back to the full extent. Addresses and places can be searched using the search box at the top right. The web map was built in ArcGIS Online: web map services published on ArcGIS Online were added as layers, and a lot of customization was done on those layers. Please note that some locations are not georeferenced properly.

View larger map

Other web maps related to Nepal Earthquake 2015
Nepal Quake Unofficial Crisis Map on Google Map
Nepal Earthquake Google Crisis Map (contains Satellite Imagery of the area)

Processing Landsat 8 images with SAC plugin for QGIS

Here is an excellent video that explains how Landsat 8 images can be downloaded and processed using the SAC (Semi-Automatic Classification) plugin for QGIS. It is an awesome tool that lets users search the Landsat database with filters such as acquisition date, cloud cover percentage and area of interest. Learn more about the SAC plugin for QGIS.


Introduction to GIS

I found this excellent resource on YouTube that introduces GIS very lucidly. I was trying to dig up some resources on QGIS, an open source GIS package, when I stumbled upon this splendid video that describes what GIS is all about in a nutshell. The video is presented by Jere Flogert from knowgis.com.




Web map of Live Weather Data in USA and Canada

Here is a web map of live weather data in the USA and Canada. The map pulls in several web map services and overlays them, an approach known as creating a "mashup". Click the double arrow in the top left corner to view the legend. The attributes of weather stations in Canada can be viewed by clicking on a station's location. For the USA, zoom in to your area of interest (click the + button at the top left, above the home icon) until the weather station icons appear, then click on a station.

View larger map

Indian GeoPortal Bhuvan


Free GIS data, free access to high-resolution satellite imagery, an interactive web map viewer, land and weather service layers, 2D and 3D views and much more are available on the Indian geoportal Bhuvan. The web map viewer, though still in beta, is interactive and user friendly, and it is compatible with most browsers. I like Bhuvan!

Here is the link to Bhuvan home page

Here is the link to the Map Viewer

Here is an article on Bhuvan posted on GIS Lounge

Earthquake mapping in real time on OregonLive


There are plenty of earthquake mapping websites, such as Seismic Monitor, the U.S. Geological Survey and the Pacific Northwest Seismic Network, to name a few. These websites show recent earthquakes across the globe; the Oregonian/OregonLive earthquake web map, however, makes underwater features a major component, focusing on tsunami waves triggered by earthquakes. The map is updated every 15 minutes and makes it easier to track earthquakes in the oceans.

Read more on OregonLive



Web Map of Median Household Income in Canada

Here is the web map of median household income in Canada for 2006. The map was created in ArcGIS Online and embedded in this website. The enhanced version now includes a legend and layer details (click the double forward arrow at the top left corner). The search box is at the top right corner. To view attribute information, just click on the desired area of the map.


View larger map

ESRI user conference in Calgary

This is the fifth consecutive year that I've attended the ESRI User Conference in Calgary. I have always enjoyed attending this conference; it has helped me keep up with current GIS technology. Here are some of the pictures taken during the conference, held on 27th May 2015.






UTSAV


Here is a copy of UTSAV, a magazine that we (the Nepalese community at the University of Calgary) created in 2010. I wrote an article called "Reminiscing Karmidanda" on page 7. It is a shame that we were unable to publish further issues; nevertheless, this one and only issue of UTSAV was a combined effort of all the Nepalese students at the U of C in 2010.


Wildfires in Alberta

Wildfires in Alberta - 66 burning across the province, 20 of them out of control, reports the Calgary Herald! Just over a month after Nepal was hit by a devastating 7.9-magnitude earthquake, and with Texas recently inundated by torrential rainfall, natural disaster has not spared Alberta either!


The situation is worse in northern Alberta than in the south, where wildland firefighters have brought the fires under control. News on Calgary Herald

Dominion Land Survey System in Western Canada

Here is some very helpful information about the DLS (Dominion Land Survey) system in western Canada.

The document below, written by Chris Rule, clearly explains how the DLS system in western Canada works. I came across it at work when I was asked to determine whether a particular legal land description in Saskatchewan exists. It was very insightful!
DLS in Western Canada

Cost effectiveness of Landsat data

The USGS states that the economic value of just one year of Landsat data far exceeds the multi-year total cost of building, launching, and managing the Landsat satellites and sensors! Landsat has been operating for more than 42 years, and we've been using Landsat imagery for all kinds of analyses. In terms of cost effectiveness, the value of Landsat data is massive!



Read complete article here

Maps of Earthquakes in Nepal 2015





Using drones to help earthquake victims in Nepal



To assist earthquake victims in Nepal, a Canadian relief team has used drones to capture high-resolution aerial images and videos to locate areas where aid is required, so that emergency responders can act efficiently. The Canadian-made drones, used by the Toronto-based humanitarian organization GlobalMedic, have taken thousands of high-resolution images that are stitched together into maps. It is amazing to see how GIS-based technologies are being used for natural disaster relief. Read more on CBC News.

Modelling the dispersion of vehicular carbon monoxide pollution along Deerfoot Trail in Calgary


Abstract (MGIS Thesis)

The main objective of this study is to model the dispersion pattern of vehicular carbon monoxide along Deerfoot Trail in Calgary by using CALINE4 software combined with GIS techniques. A road network (14.28 km long), extending from south-east to north-east Deerfoot Trail, is considered as the road segment under study. The study is broadly divided into three parts, viz. CALINE4 analysis, GIS analysis and S-PLUS analysis. The CALINE4 analysis involves two run types: 1-hour average CO concentrations at the receptors are calculated under the 'Standard' run type, and 8-hour average CO concentrations under the 'Multi-Run/Worst-Case hybrid' run type. A typical day (June 1, 2006) is chosen to calculate the 1-hour average CO concentration at the receptors from 7:00 am to 6:00 pm. The predicted CO is used to evaluate the model by comparison with the observed CO values at the south-east monitoring station. The output from the standard run type shows that the predicted CO concentration depends mainly on wind speed and wind direction. The R-square value of 0.47 suggests that the model explained 47% of the original variability in the data. The predicted 8-hour average CO concentration from 7:00 am to 2:00 pm is used to interpolate a surface predicting CO values at unknown locations within the study area. The predicted CO concentrations at the receptors are interpolated using Inverse Distance Weighting (IDW), Global Polynomial (GP), Ordinary Kriging (J-Bessel model) and Universal Kriging (Spherical model) interpolation methods. The Ordinary Kriging model is statistically more significant than the Universal Kriging model, with a lower Root Mean Square (RMS) value of 0.32 and a higher RMS standardized value of 0.92. However, the Trend Surface Analysis shows the presence of a first-order trend in the data, and the Geometric Anisotropy shows the presence of anisotropy. The Ordinary Kriging model therefore uses a biased estimator and is not reliable. The Universal Kriging model accounts for both the trend and the anisotropy in the data; as such, it is the best model and can be used reliably for decision-making purposes. This model has RMS and RMS standardized values of 0.33 and 0.87 respectively. The S-PLUS analysis is used to validate the statistical significance of the GIS analysis and to perform Universal Kriging. The Universal Kriging model shows CO concentration 'hot-spots', especially in the areas close to the highway.
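The model evaluation described above boils down to comparing predicted and observed CO concentrations with RMS and R-square statistics. As a minimal illustration of that step (the CO values below are made-up placeholders, not data from the thesis), those statistics could be computed as follows:

# model_evaluation_sketch.py - illustrative only; the CO values below are placeholders, not thesis data
import numpy as np

observed = np.array([0.8, 1.1, 0.9, 1.4, 1.0])    # observed CO (ppm) at the monitoring station
predicted = np.array([0.7, 1.2, 1.0, 1.2, 0.9])   # CO predicted by the dispersion model

# Root mean square error between predicted and observed concentrations
rmse = np.sqrt(np.mean((predicted - observed) ** 2))

# R-square: the proportion of the variability in the observations explained by the model
ss_res = np.sum((observed - predicted) ** 2)
ss_tot = np.sum((observed - np.mean(observed)) ** 2)
r_square = 1.0 - ss_res / ss_tot

print("RMSE: %.3f  R-square: %.3f" % (rmse, r_square))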

Read entire thesis

Poster presentation of MGIS thesis (awarded the best MGIS poster by the Department of Geography at the University of Calgary in 2010)

GIS Habitat modelling of Bufo Calamita on Ainsdale Sand Dunes National Nature Reserve


Abstract (M.Sc. Thesis)

The main objective of this study is to model the habitats of Bufo Calamita in Ainsdale Sand Dunes National Nature Reserve (north-west England) by using GIS habitat modelling techniques. Various GIS techniques are used in this study, viz. Multi-Criteria Evaluation (MCE) analysis, Neighbourhood analysis, Overlay analysis and 3D views of the models. The results derived from the Neighbourhood analysis are the most effective and have the greatest potential from a management perspective. The Neighbourhood analysis results were overlaid with the assigned values of the land cover classes in the Overlay analysis to obtain the sites suitable for restoration/preservation. The 3D views were used to give a clear picture of the models in three-dimensional form. The study revealed that GIS techniques are efficient in creating a predictive model of Bufo Calamita's habitat in Ainsdale Sand Dunes. Habitat modelling of Bufo Calamita is a daunting task, as there are several factors/variables to consider that affect its habitat; moreover, habitat suitability for Bufo Calamita differs between seasons. The model was evaluated by sending a set of questionnaires about the model's efficiency to the experts at Ainsdale Sand Dunes. It was acknowledged that the results obtained from this study have great practical implications, especially in prioritizing the potential sites suitable for restoration and preservation, which is crucial in the decision-making process from a management perspective. Read entire thesis
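The Multi-Criteria Evaluation and Overlay analysis mentioned above essentially combine criterion layers into a single suitability surface. Below is a minimal weighted-overlay sketch; the criterion layers, weights and threshold are illustrative assumptions, not values from the thesis.

# mce_overlay_sketch.py - illustrative weighted overlay; the layers, weights and threshold are placeholders
import numpy as np

# Criterion rasters rescaled to 0-1 suitability (e.g. distance to breeding ponds,
# land cover suitability, distance to disturbance), shown here as tiny arrays
ponds = np.array([[0.9, 0.6], [0.4, 0.2]])
cover = np.array([[1.0, 0.5], [0.5, 0.0]])
disturbance = np.array([[0.8, 0.7], [0.3, 0.1]])

# Assumed criterion weights; in a typical MCE they sum to 1
w_ponds, w_cover, w_dist = 0.5, 0.3, 0.2

# Weighted linear combination produces an overall habitat suitability surface
suitability = w_ponds * ponds + w_cover * cover + w_dist * disturbance

# Cells above a chosen threshold could be flagged as candidate restoration/preservation sites
candidate_sites = suitability >= 0.7
print(suitability)
print(candidate_sites)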

Advanced Spatial Analysis of Blue-winged Teal habitat in Canadian Prairie Pothole Region


‘The Prairie Pothole Region (PPR) of North America is the primary breeding ground for many ducks; about 80% of the Region is in Canada’ (Batt et al. 1989 cited in Greenwood et al., 1995). ‘The region is characterized by a high density of shallow, productive wetlands that support an abundance of waterfowl and other water birds’ (Kantrud et al., 1989 cited in Austin et al., 2000). ‘The PPR of Canada (Fig. 1) is composed of about 480,000 km2 and spreads across southeastern Alberta, southern Saskatchewan, and southwestern Manitoba with a flat to gently rolling landscape dissected by several rivers’ (Greenwood et al., 1995).

‘The Blue-winged teal is one of the migratory North American waterfowl scientifically known as Anas discors’ (Sandilands 2005). According to Ducks Unlimited Canada (2008), ‘the blue-winged teal ranks fourth in numbers among North American ducks but their numbers have dropped as low as three million or less in recent years from a high of more than 5 million in the 1950s.’ They found that Blue-winged teal populations are highly sensitive to drought, to the removal of their habitat, and to its conversion by humans to other uses such as agriculture, suburban expansion and road building.

The decline of the Blue-winged teal, along with several other migratory ducks, in the Canadian PPR is a major concern and is the main reason for carrying out this study. The Blue-winged teal's habitat is examined by treating its population density as the main dependent variable, with factors such as precipitation, pond density, conserved soil moisture and land cover suitability as the independent variables.

This study makes use of secondary data from various websites. The nature and scope of the study are broad, and it is almost impossible to capture all the factors that affect the population density of a species such as the Blue-winged teal. However, initiating this study is crucial to understanding the factors contributing to the decline of this species, and it provides a sound basis for further studies with broadly similar objectives. Read entire article

Modeling the dispersion of vehicular carbon-monoxide pollution in Kathmandu valley

Modeling the dispersion of vehicular carbon-monoxide pollution in Kathmandu valley, Nepal: A CALINE4 approach combined with GIS Techniques

Abstract

Kathmandu valley is more vulnerable to air pollution than other rapidly growing Asian cities because of the bowl-like structure of the valley and the poor wind speed inside it. The main objective of this study is to model the dispersion pattern of vehicular carbon monoxide in Kathmandu valley by using CALINE4 software combined with GIS techniques. CALINE4 uses vehicle count, pollution and meteorological data to predict carbon monoxide (CO) concentration. A typical day (15th of February 2007) is chosen to calculate the 1-hour average CO concentration at the receptor points during peak hour (8:00 – 9:00 am). A road network extending from Maitighar to Koteshwor, approximately 4 km in length, is considered as the main road network. Ninety receptor points are created within the 500-meter buffer area of the main road network, and CALINE4 is used to predict the CO concentration at these points. The predicted CO concentrations at the receptor points are then interpolated using K-Bessel universal kriging. The resulting map is reclassified to create 'hot-spots', where areas are classified based on the predicted CO concentration. The root mean square error (RMSE) method is used to evaluate the model performance by comparing the predicted and observed CO concentrations within a 10-meter buffer of the study site. The RMSE value is found to be 0.77 and the accuracy of the model performance is estimated as 74%. Read entire article

Principal Components Analysis and Tasseled Cap Transformation used in Landsat ETM+ images



Abstract

This paper describes how principal components analysis (PCA) and the Tasseled Cap (TC) Transformation were applied to Landsat ETM+ images of south-east London. The six-dimensional Landsat dataset was reduced to three major components, where the first component accounted for the maximum proportion of the variance of the original dataset and each subsequent orthogonal component accounted for the maximum proportion of the remaining variance. The PCA showed urban settings in the first component, vegetation in the second and water in the third. Similarly, the TC transformation showed urban settings in the brightness component, vegetation in the greenness component and water in the wetness component. Compared to PCA, the TC transformation has a more analytical basis, as it incorporates a generalization from empirical observations. Comparing the PCA and TC components showed that PC1 contained more brightness information than TC1, whereas TC2 and TC3 included more pertinent information than PC2 and PC3. The colour composites PCA123 and TC123 were visually compared, which showed that, for this study, TC123 produced better results.
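As a generic illustration of the principal components step described above (plain NumPy on a random placeholder stack, not the software or data used in the paper), the components can be derived from the eigenvectors of the band covariance matrix:

# pca_sketch.py - generic principal components of a multi-band stack; random data stand in for Landsat bands
import numpy as np

bands, rows, cols = 6, 100, 100
stack = np.random.rand(bands, rows, cols)            # placeholder for a six-band ETM+ subset

# Arrange pixels as (n_pixels, n_bands) and centre each band on its mean
pixels = stack.reshape(bands, -1).T
pixels = pixels - pixels.mean(axis=0)

# Eigen-decomposition of the band covariance matrix
cov = np.cov(pixels, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort components by decreasing explained variance and project the pixels onto them
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
scores = np.dot(pixels, eigvecs)                     # principal component scores per pixel

print("Proportion of variance explained:", np.round(eigvals / eigvals.sum(), 3))

# The first three components, reshaped back to image form, could be composited as PCA123
pc_images = scores.T.reshape(bands, rows, cols)[:3]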

Read entire article (It is password protected. Please contact me for access to this article.)







Unsupervised and Supervised Classification of Remotely Sensed Imagery



Introduction

Remote sensing has increasingly been used as a source of information for characterizing land use and land cover change at local, regional and global scales (Townshed and Justice, 2002; Lunetta and Lyons, 2003 in Jensen, 2005, p. 337). Land use/land cover classification based on statistical pattern recognition is one of the most often used methods of information extraction (Narumalani et al., 2002 in Jensen, 2005, p. 337). In this study, unsupervised and supervised classifications were carried out for land-cover mapping of a remotely sensed image of London.

Objectives of study

● To classify an image using both supervised and unsupervised techniques, including:
i) Evaluation of which Landsat bands to include in a classification
ii) Analysis of signature separability
iii) Class aggregation
iv) Inclusion of ancillary data
● To interpret the results of the classification using accuracy calculations (a minimal classification sketch follows below).
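As a minimal sketch of the unsupervised part of this exercise (the original work was done in a desktop image-processing package; the scikit-learn call and the random placeholder data below are assumptions for illustration only), pixels can be clustered into spectral classes that the analyst would later aggregate into land-cover classes:

# kmeans_classification_sketch.py - illustrative unsupervised classification; random data stand in for imagery
import numpy as np
from sklearn.cluster import KMeans

bands, rows, cols = 6, 50, 50
stack = np.random.rand(bands, rows, cols)          # placeholder for a Landsat subset

# Arrange pixels as (n_pixels, n_bands) feature vectors
pixels = stack.reshape(bands, -1).T

# Cluster the pixels into a chosen number of spectral classes
n_classes = 5
kmeans = KMeans(n_clusters=n_classes, n_init=10, random_state=0)
labels = kmeans.fit_predict(pixels)

# Reshape the class labels back into an image; these clusters would then be
# aggregated and assigned land-cover names by the analyst
classified = labels.reshape(rows, cols)
print(np.unique(classified, return_counts=True))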

Read entire article (It is password protected. Please contact me for access to this article.)

Report on Image calculations, Local window filters and Scaling image data


Abstract

This paper describes how band ratioing provides unique information, not available in any single band, that is useful for discriminating ground features. It also presents the significance of spatial filtering and rescaling of an image. In this study, four different band ratios were created, which helped in extracting useful information about the ground features. The colour composite image of the three indices revealed some important information about the ground features and eliminated the effects of shadowing. The ratio with low correlation between the bands, i.e. the ratio of red to near-infrared (NIR), was found to contain more information than the ratios with high correlation between the bands. The NDVI ratio was rescaled using both the standard scaling algorithm and the raster calculator algorithm, and a scatter plot was used to verify that the results were the same for both rescaled NDVI channels. The panchromatic layer containing just band 4 was extracted, and a median low-pass filter and a Sobel edge-detection high-pass filter were applied to it. The median filter was found to smooth the image, remove noise and maintain the edges, while the Sobel filter was found to enhance sharp edges.
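As a minimal illustration of the band ratioing and rescaling steps mentioned above (generic NumPy code with placeholder arrays, not the workflow used in the report), NDVI and a 0-255 rescaled version could be produced as follows:

# ndvi_rescale_sketch.py - illustrative band ratioing; the arrays are placeholders for red and NIR bands
import numpy as np

red = np.random.rand(100, 100).astype("float32")   # placeholder red band
nir = np.random.rand(100, 100).astype("float32")   # placeholder near-infrared band

# NDVI = (NIR - Red) / (NIR + Red); a tiny epsilon avoids division by zero
ndvi = (nir - red) / (nir + red + 1e-9)

# Rescale NDVI from its [-1, 1] range to an 8-bit [0, 255] range for display
ndvi_scaled = np.clip((ndvi + 1.0) / 2.0 * 255.0, 0, 255).astype("uint8")

print("NDVI range:", ndvi.min(), ndvi.max())
print("Rescaled range:", ndvi_scaled.min(), ndvi_scaled.max())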

Read entire article (It is password protected. Please contact me for access to this article.)


Multi-scale information extraction of land cover from Landsat Imagery and DEM data

Introduction

The main purpose of this study is to perform a multi-scale information extraction of land cover and vegetation structure from Landsat TM imagery and DEM data, using classification and statistical modeling methods. This study also aims to evaluate the results of the analysis through accuracy statistics and meaningful map products. The data available for this study is a Landsat TM image of the Hinton and Jasper, Alberta area with channels 1 to 5 and 7. A channel containing a DEM of the area and three additional channels with tasseled cap (TCA) outputs are also provided.

Additionally, three shape files containing land cover, leaf area index (LAI) and Crown closure are also available. These shape files are in the form of points and are basically used to create training sites and assess the accuracy of the results. The land cover shape file is a file containing 437 land cover calls made by field personnel observing a 90 x 90 meter area roughly equivalent to nine TM pixels. The leaf area index shape file is a file containing 37 estimates of LAI obtained by field personnel using an Accupar Ceptometer over a 30 x 30 meter ground plot roughly equivalent to one TM pixel. The crown closure shape file contains 73 estimates of crown closure measured by field personnel using spherical densiometers over a 30 x 30 meter ground plot roughly equivalent to one TM pixel.

The land cover shape file is used to create a classified image. The LAI and crown closure shape files are used to predict LAI and crown closure values for the entire Landsat scene. With the help of these datasets, it is possible to produce multiple maps that can be used to explore the relationships between LAI, crown closure and the processes that are taking place on the ground. The main objective of this study is to produce two maps that demonstrate how these different sets of data can be used to generate meaningful information at different scales.


Radar Polarimetry in Remote Sensing


Introduction

Radar polarimetry is defined as the science and techniques involved in measuring and analyzing the complex scattering matrix of pixels in a radar image (CCRS 2008). The microwave energy from a radar system includes wavelengths within the approximate range of 1 cm to 1 m and is capable of penetrating the atmosphere under virtually all conditions, depending on the wavelengths involved (Lillesand et al. 2004). The other advantages of radar over optical sensors are that radar operates at user-specified times and provides unique information, as it senses wavelengths outside the visible portion of the EMR spectrum (Geog 633 2008).

The radar transmits either horizontally polarized (H) or vertically polarized (V) microwave radiation which can then generate a back-scattered wave with a variety of polarizations. Any polarizations on either transmission or reception can be synthesized by using H and V components with a total of four combinations of transmit and receive polarizations (CCRS 2008). In this study we focus on dual-polarized and fully polarized data sets.

For the purpose of this assignment, we have two types of multi-polarized imagery, of which one is dual-polarized and the other is polarimetric. The dual-polarized data is from Germany, acquired by the ENVISAT ASAR sensor, and is provided with a header file. The polarimetric data set is from an unknown area, acquired by the Convair-580. The main objectives of this study are to explore the differences between multi-polarized imagery using the PWS and PCI Geomatica software packages, to examine how different polarizations can be used to understand the physical properties of features, and to understand how the "polarization signature" of a target provides a convenient way of visualizing its scattering properties.


Digital Numbers, Reflectance, and Atmospheric Correction in Remote Sensing


Abstract

This study focuses on the radiometric transformation of two Landsat images (the first an ETM+ image acquired on September 23, 2001 and the second a TM image from June 17, 2003) covering portions of the foothills and mountains west of Calgary, Alberta. A total of four radiometric transformation techniques, viz. converting raw DNs to radiance, converting radiance to TOA reflectance, absolute atmospheric correction using PCI's atmospheric correction package, and atmospheric normalization via relative atmospheric correction using the linear transformation method, were used on the images. The accuracy assessment suggests that, for our study site, atmospheric normalization via relative atmospheric correction using the linear transformation method is the most effective, with an RMSE value of 9.5955, and absolute atmospheric correction using PCI's atmospheric correction package is the least effective, with an RMSE value of 15.0371.
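As a minimal, generic sketch of the first two transformations described above (raw DNs to radiance, then radiance to top-of-atmosphere reflectance), using the standard Landsat conversion formulas with made-up gain, offset and geometry values rather than the actual scene metadata:

# dn_to_reflectance_sketch.py - illustrative DN-to-TOA-reflectance conversion with placeholder metadata
import numpy as np

dn = np.random.randint(1, 255, size=(100, 100)).astype("float32")  # placeholder raw digital numbers

# Sensor calibration coefficients and scene geometry (placeholders; real values come from the image metadata)
gain = 0.7756          # radiance per DN
offset = -6.2          # radiance offset
esun = 1039.0          # mean solar exoatmospheric irradiance for the band
sun_elevation = 45.0   # degrees
earth_sun_dist = 1.01  # astronomical units

# 1) Raw DNs to at-sensor spectral radiance: L = gain * DN + offset
radiance = gain * dn + offset

# 2) Radiance to top-of-atmosphere reflectance: rho = pi * L * d^2 / (ESUN * cos(solar zenith))
solar_zenith = np.deg2rad(90.0 - sun_elevation)
toa_reflectance = np.pi * radiance * earth_sun_dist ** 2 / (esun * np.cos(solar_zenith))

print("TOA reflectance range:", toa_reflectance.min(), toa_reflectance.max())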


Download and Integration of Earth Observing System (EOS) data


Introduction

The Earth Observing System (EOS) instruments are designed to collect data to provide a comprehensive overview of the dynamic components of the Earth's atmosphere, land and water surfaces (Campbell 2007). ‘The EOS is one of the primary components of a NASA-initiated concept which includes numerous platforms and sensors, including the Terra and Aqua spacecrafts. There are five sensors included on Terra and MODIS (Moderate Resolution Imaging Spectro-Radiometer) is one of them. MODIS is a sensor that is intended to provide comprehensive data about land, ocean, and atmospheric processes simultaneously’ (Lillesand et al. 2004). In this study, MODIS Terra vegetation indices are used to create an NDVI map for the province of Alberta, Canada. The main objectives of the study are to become familiar with the variety of remote sensing data sources accessible through the internet; to gain experience in accessing, obtaining and integrating data from disparate online sources; to enhance knowledge about different file formats and the appropriate conversion and re-projection processes; and finally to develop the cartographic skills to display the appropriate map message.


Report on Multiband Images, Colour Compositing, and Contrast Enhancement


Introduction
Image analysis by photo-interpretation is often facilitated when the radiometric nature of the image is enhanced to improve its visual impact. Specific differences in vegetation and soil types, for example, may be brought out by increasing the contrast of an image. Similarly, differences in brightness value can be highlighted either by contrast modification or by assigning quite different colours to those levels (Richards, 1989). This study presents a variety of image enhancement procedures often used with remote sensing image data.

Objectives of study
i) To explore the basic capabilities of PCI Geomatica software,
ii) To introduce colour composites, histograms, and scatterplots as tools for exploring image data stored in database channels, and
iii) To learn about contrast enhancements and the impact of the different enhancement types on raw imagery (a minimal contrast-stretch sketch follows below).
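As a minimal, generic sketch of one common contrast enhancement mentioned above (a 2% linear stretch; the array below is a placeholder, and the report itself used PCI Geomatica rather than Python):

# linear_stretch_sketch.py - illustrative 2% linear contrast stretch on a placeholder band
import numpy as np

band = np.random.normal(loc=90, scale=20, size=(200, 200))   # placeholder single-band image

# Clip the lowest and highest 2% of values, then stretch the rest over the full 0-255 display range
low, high = np.percentile(band, (2, 98))
stretched = np.clip((band - low) / (high - low), 0.0, 1.0) * 255.0
stretched = stretched.astype("uint8")

print("Input range: %.1f to %.1f" % (band.min(), band.max()))
print("Stretched range: %d to %d" % (stretched.min(), stretched.max()))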

Read entire article (It is password protected. Please contact me for access to this article.)

Geographically Weighted Regression in Advanced Spatial Analysis and Modeling


The main purpose of this study is to run a Geographically Weighted Regression (GWR) on the City of Calgary census data. The data available for the study is the census tract data set for the City of Calgary, provided in a geodatabase format. The dataset is similar to that of the first and fifth assignments; however, the previous assignments involved the calculation of a simple linear regression and a spatial regression, which output global parameters rather than local ones. Therefore, in this study, the objective is to fit a GWR model to the census data using "Average Income" as the dependent variable, so as to allow local parameter estimates in the model. The "Geographically Weighted Regression 3" software package is used to compute the geographically weighted regression model. This paper includes many statistical techniques applied to obtain the final model. It also compares the results obtained from the linear regression model and the spatial regression model with the Geographically Weighted Regression model.
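The study itself uses the "Geographically Weighted Regression 3" package; purely as an illustration of the idea of locally weighted parameter estimates (with made-up coordinates, variables and a fixed Gaussian bandwidth, not the Calgary census data), a GWR fit at one location can be sketched as a weighted least-squares problem:

# gwr_sketch.py - illustrative local (geographically weighted) fit at one location;
# the coordinates, variables and bandwidth below are made-up placeholders
import numpy as np

np.random.seed(0)
n = 200
coords = np.random.rand(n, 2) * 10000                       # placeholder census-tract centroids (metres)
X = np.column_stack([np.ones(n), np.random.rand(n, 2)])     # intercept + two explanatory variables
true_beta = np.array([50000.0, 8000.0, -3000.0])
y = np.dot(X, true_beta) + np.random.normal(0, 2000, n)     # e.g. average income

def gwr_local_fit(target_xy, coords, X, y, bandwidth):
    """Weighted least-squares estimate of the coefficients at a single location."""
    d = np.sqrt(np.sum((coords - target_xy) ** 2, axis=1))  # distances to all observations
    w = np.exp(-0.5 * (d / bandwidth) ** 2)                 # Gaussian kernel weights
    XtW = X.T * w                                           # equivalent to X.T dot diag(w)
    beta = np.linalg.solve(np.dot(XtW, X), np.dot(XtW, y))
    return beta

# Local coefficients at the first centroid, with an assumed 2 km bandwidth; repeating this at
# every location yields the spatially varying parameter surfaces that GWR produces
beta_local = gwr_local_fit(coords[0], coords, X, y, bandwidth=2000.0)
print("Local coefficient estimates:", np.round(beta_local, 1))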

Spatial Regression in Advanced Spatial Analysis and Modeling

Introduction

The main purpose of this study is to explore the process of model selection and spatial regression using S-Plus and its Spatial module. The data available for the study is the census tract data set for the City of Calgary, provided both as a geodatabase and as an S-Plus data frame. The dataset is similar to that of our first assignment, but with the addition of two columns, X and Y, representing easting and northing respectively. The previous assignment involved the calculation of a simple linear regression without taking spatial autocorrelation into account. Therefore, in this study, we fit a regression model to the census data using ‘Average Income’ as the dependent variable while considering spatial autocorrelation in the regression model. The S-Plus command line was used to compute a Simultaneous Autoregressive (SAR) model to describe the relationship between ‘Average Income’ and the other independent variables. This paper includes many statistical techniques applied to obtain the spatial regression model. It also compares the results obtained from the simple linear regression (in Assignment 1) with the model obtained in this study.

Read entire article (It is password protected. Please contact me for access to this article.)