Memory problem when processing satellite images

Hello everyone,

I'm trying to process Landsat 8 images in R, but in the first step (importing the images) I have a problem using the readGDAL() function, as in the code below:

library(landsat8)
library(sp)
library(raster)
library(rgdal)
library(dplyr)
library(glue)

names <- c("B1","B2","B3","B4","B5","B6","B7","B8","B9","B10","B11")
band <- list()
caminho <- "C:/Users/ERLI/Downloads/LC08_L1TP_215073_20190602_20190602_01_T1/"
setwd(caminho)
for (i in names) {
  band[[i]] <- readGDAL(glue("LC08_L1TP_215073_20190602_20190602_01_T1_{i}.tif"))
}

And after the 7th image, when it reaches the larger panchromatic band (B8), the following error message is shown:

LC08_L1TP_215073_20190602_20190602_01_T1_B1.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B2.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B3.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B4.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B5.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B6.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B7.tif has GDAL driver GTiff 
and has 7751 rows and 7641 columns
LC08_L1TP_215073_20190602_20190602_01_T1_B8.tif has GDAL driver GTiff 
and has 15501 rows and 15281 columns
Error: cannot allocate vector of size 903.6 Mb

I've also tried the raster package and its raster() function, but the object it produces is of the formal class RasterLayer, which I can't use in the next step. I'm using a desktop with 64-bit Windows 7, a 500 GB hard drive, and 4 GB of RAM.
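
To illustrate the class mismatch: readGDAL() returns a SpatialGridDataFrame, while raster() returns a RasterLayer. Below is a minimal sketch of the coercion I think would be needed between the two (assuming the later processing functions expect the sp-style object that readGDAL() gives; I haven't confirmed this works for my workflow):

library(raster)
library(sp)

f <- "LC08_L1TP_215073_20190602_20190602_01_T1_B1.tif"
r <- raster(f)                         # RasterLayer: reads the file lazily, not all into RAM
sgdf <- as(r, "SpatialGridDataFrame")  # coerce to the class readGDAL() returns (this pulls the values into memory)
r2 <- raster(sgdf)                     # and back to a RasterLayer if needed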

How can I process Landsat images with this computer's configuration?

You won't be able to load all your data into memory if it takes more space than is available.
You are currently creating a list with all of your .tif data in memory, and it seems that won't be possible. Do you need to load all of your data at the same time to process it?

If not, you can process the files piece by piece, writing each piece to disk for caching when you are finished with it.
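
Something along these lines, for example (untested, and just a sketch: I'm assuming the file naming from your post, and the r * 1 line is only a placeholder for whatever per-band processing you actually do):

library(raster)
library(glue)

scene <- "LC08_L1TP_215073_20190602_20190602_01_T1"

for (b in paste0("B", 1:11)) {
  r <- raster(glue("{scene}_{b}.tif"))   # opens the file without loading all values into RAM
  out <- r * 1                           # placeholder for the real per-band processing
  writeRaster(out, filename = glue("{scene}_{b}_out.tif"),
              format = "GTiff", overwrite = TRUE)
  rm(r, out)
  gc()                                   # free memory before moving on to the next band
}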

Otherwise, you'll need a bigger machine, or another processing tool for this kind of data. I am not an expert in these formats, so I can't point you to anything specific. Maybe someone else will.


Dear @cderv, I've processed all the data (the .tif images) in steps, without creating intermediate variables: reading each file from the hard disk and writing the result straight back to disk.
For example, to obtain TOA (Top of Atmosphere) reflectance:

for (i in 1:11) {
  writeRaster(raster(radconv(readGDAL(glue("LC08_L1TP_215073_20190602_20190602_01_T1_B{i}.tif")),
                             radiancia[i, 2], radiancia[i, 1])),  # radiancia: per-band rescaling coefficients
              filename = glue("LC08_L1TP_215073_20190602_20190602_01_T1_B{i}_L.tif"),
              format = "GTiff")
}
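
A note for anyone who finds this later: raster::calc() can also write its result to disk block by block through its filename argument, so the full grid never has to sit in memory. A rough sketch, with made-up gain and offset values standing in for the real rescaling coefficients from the scene metadata:

library(raster)

b8 <- raster("LC08_L1TP_215073_20190602_20190602_01_T1_B8.tif")
gain <- 0.0001   # placeholder, not the real metadata coefficient
offset <- 0.1    # placeholder, not the real metadata coefficient
calc(b8, fun = function(x) x * gain + offset,
     filename = "LC08_L1TP_215073_20190602_20190602_01_T1_B8_L.tif",
     format = "GTiff", overwrite = TRUE)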

Thank you for the recommendations!


Glad I could help!

If your question's been answered (even by you!), would you mind choosing a solution? It helps other people see which questions still need help, or find solutions if they have similar problems. Here’s how to do it:

