Setting up Sublime Text 3

In this blog post, I will talk about how I set up Sublime Text 3 for my work. I recently changed my main work computer and spent quite some time configuring it. This post is more of a note to self, so that I do not have to spend as much time the next time I need to configure it on a new computer.

Install the Package Control package

The very first task you need to complete after installing Sublime Text is to install the Package Control package.

Do you really need to get the inverse of that weight matrix?

Motivation

When you run simulations with spatially dependent data, you can use a weight matrix to quantify the nature and degree of dependence between observations. For example, suppose you would like to generate spatially correlated shocks (errors), \(\varepsilon\), specified as follows:

\[ \varepsilon = \lambda W \varepsilon + u \]

where \(W\) is the weight matrix, \(\lambda \in [0, 1]\) is a parameter that determines the degree of spatial dependence, and \(u \sim N(0, \sigma^2)\).
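The answer to the question in the title is no: rearranging the equation gives \((I - \lambda W)\varepsilon = u\), so \(\varepsilon\) can be obtained by solving a linear system directly, without ever forming the inverse of \(I - \lambda W\). A minimal sketch in R, where the toy weight matrix and the value of \(\lambda\) are purely illustrative assumptions:

```r
set.seed(42)
n <- 5

# a toy row-standardized weight matrix (illustrative; in practice W
# would come from the spatial structure of your observations)
W <- matrix(runif(n * n), n, n)
diag(W) <- 0
W <- W / rowSums(W)

lambda <- 0.5
u <- rnorm(n)

# solve the linear system (I - lambda * W) eps = u directly
eps <- solve(diag(n) - lambda * W, u)

# equivalent, but slower and less numerically stable: invert first,
# then multiply
eps_inv <- solve(diag(n) - lambda * W) %*% u

all.equal(eps, as.vector(eps_inv))
```

Both routes give the same \(\varepsilon\), but `solve(A, b)` skips the explicit inversion, which matters as the number of observations grows.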

Converting Raster (Gridded) Data to SpatialPolygons (and then to an sf object)

In this short blog post, I will show three ways to convert raster (gridded) data into polygons. In the course of my research, I have needed this type of conversion on multiple occasions. As far as I am aware, the available options differ (significantly) in computation time.

```r
library(raster)
library(rgdal)
library(sp)
library(sf)
```

We use PRISM data to demonstrate how we can get raster-to-polygons conversions done.
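As a minimal sketch of one such conversion route, the following uses `raster::rasterToPolygons()` followed by `sf::st_as_sf()`; the small toy raster here is an illustrative stand-in for the PRISM data:

```r
library(raster)
library(sf)

# a small toy raster standing in for the PRISM data
r <- raster(nrows = 3, ncols = 3, vals = 1:9)

# convert each cell to a polygon; this returns a
# SpatialPolygonsDataFrame (an sp object)
sp_polys <- rasterToPolygons(r)

# the sp object can then be converted to an sf object
sf_polys <- st_as_sf(sp_polys)
```

For large rasters this route can be slow, which is exactly why comparing the alternatives' computation times is worthwhile.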

Downloading data from the NASS Quick Stats website using R

Data provided at NASS Quick Stats are very useful for understanding the history and state of agricultural production at aggregate levels (county and state). They have also been used widely in various studies, including the estimation of climate change impacts on crop yields. In this blog post, I will show how to download NASS Quick Stats data from within R using the rnassqs package. While their website is useful, it only allows you to download 50,000 records (rows of data) at a time.
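As a minimal sketch of an rnassqs query (the query parameter values below are illustrative, and an API key requested from NASS is assumed to be stored in the `NASS_KEY` environment variable):

```r
library(rnassqs)

# authenticate with your NASS API key (assumed to be in NASS_KEY)
nassqs_auth(key = Sys.getenv("NASS_KEY"))

# an illustrative query: county-level corn yields in Nebraska
# from 2015 onward
params <- list(
  commodity_desc = "CORN",
  statisticcat_desc = "YIELD",
  agg_level_desc = "COUNTY",
  state_alpha = "NE",
  year__GE = 2015
)

# run the query; this returns a data frame of records
yields <- nassqs(params)
```

Because the query is issued from R, it can be looped over years or states, sidestepping the 50,000-record-per-download limit of the web interface.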