Inserting Custom Channels via layer_lambda() into a Neural Network

I'm using Keras to build deep CNNs, and I want to add a set of channels to each layer that act as a coordinate system for every point. That is, convolution operations will know "where" a pixel is from the activity they see at each point in the coordinate channels. Here is the code for creating the coordinate channels:

library(keras)
library(EBImage)

coordinates <- function(dim_x, dim_y) {
  # Normalized coordinates: x and y in roughly [-1, 1], r = distance from center.
  map_x <-   replicate(dim_y, (0.5 - dim_x/2):(dim_x/2 - 0.5))  * 2 / dim_x
  map_y <- t(replicate(dim_x, (0.5 - dim_y/2):(dim_y/2 - 0.5))) * 2 / dim_y
  map_r <- sqrt(map_x^2 + map_y^2)

  # Each channel highlights a single coordinate position (x, y, r) with a
  # triangular bump that peaks at that position and falls to 0 within 1/num.
  num <- 2
  maps <- array(data = 0, dim = c(dim_x, dim_y, 5 * num + 3))
  ch <- 0
  for (x_pos in (-num:num)/num) {   # 2*num + 1 channels for x
    ch <- ch + 1
    maps[,,ch] <- 1 - num * abs(map_x - x_pos)
  }
  for (y_pos in (-num:num)/num) {   # 2*num + 1 channels for y
    ch <- ch + 1
    maps[,,ch] <- 1 - num * abs(map_y - y_pos)
  }
  for (r_pos in (0:num)/num) {      # num + 1 channels for r
    ch <- ch + 1
    maps[,,ch] <- 1 - num * abs(map_r - r_pos)
  }
  maps[maps < 0] <- 0               # clip the bumps at zero
  
  return(maps)
}

maps <- coordinates(75, 125)

# Tile the 13 channels into a 5 x 3 montage (the 0* terms are blank padding).
img <- cbind(rbind(maps[,, 1], maps[,, 2], maps[,, 3], maps[,, 4], maps[,, 5]),
             rbind(maps[,, 6], maps[,, 7], maps[,, 8], maps[,, 9], maps[,,10]),
             rbind(0*maps[,,10], maps[,,11], maps[,,12], maps[,,13], 0*maps[,,13]))
EBImage::display(img)

Here's the output:

[image: montage of the 13 coordinate channels: five x channels, five y channels, and three r channels]

My problem is that I can't figure out how to insert this array into the network. The input to the network has shape (?, 75, 125, 3), and the coordinate channels have shape (75, 125, 13). After inserting the coordinates, it should become (?, 75, 125, 16). However, running the code below seems to lead to an infinite loop.

input <- layer_input(shape = c(75, 125, 3)) %>%
  layer_lambda((function (x) k_concatenate(c(x, maps))))

Does the k_concatenate() function have a problem with 'maps' not being a Tensor? (Or not having a NULL first dimension?) If so, how should I address this? Would there be some way to sneak in the channels via an auxiliary input? Any help is appreciated.
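
For reference, here's a rough sketch of the auxiliary-input idea (untested, and the convolution layer is only a placeholder): give the coordinates their own input layer, merge the two with layer_concatenate(), and replicate maps across the samples dimension on the R side before training.

img_in   <- layer_input(shape = c(75, 125, 3))
coord_in <- layer_input(shape = dim(maps))                 # (75, 125, 13)

merged <- layer_concatenate(list(img_in, coord_in))        # (?, 75, 125, 16)
output <- merged %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3),       # placeholder layer
                activation = "relu")

model <- keras_model(inputs = list(img_in, coord_in), outputs = output)

# One copy of the coordinate array per sample, e.g. for 100 samples:
coord_batch <- aperm(replicate(100, maps), c(4, 1, 2, 3))  # (100, 75, 125, 13)

The drawback is that every batch then carries redundant copies of the same constant array, which is why I'd rather bake the channels into the graph if possible.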

Update:
When I modify the network code to the following:

input <- layer_input(shape = c(75, 125, 3)) %>%
  layer_lambda((function (x) k_concatenate(c(x, k_constant(maps)))))

it no longer enters an infinite loop. But now it gives me the following error message:

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  RuntimeError: Evaluation error: ValueError: Shape must be rank 4 but is rank 3 for 'lambda_8/concat' (op: 'ConcatV2') with input shapes: [?,75,125,3], [75,125,13], [].

Detailed traceback: 
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 1956, in concatenate
    return tf.concat([to_dense(x) for x in tensors], axis)
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/tensorflow/python/util/dispatch.py", line 180, in wrapper
    return target(*args, **kwargs)
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/tensorflow/python/ops/array_ops.py", line 1256, in concat
    return gen_array_ops.concat_v2(values=values, axis=axis, name=name)
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/tensorflow/python/ops/gen_array_ops.py", line 1149, in concat_v2
    "ConcatV2", values=values, axis=axis, name=name)
  File "/hom

It seems like I need to give the constant a NULL first dimension when converting it with k_constant(), but I am not sure how to do this.

This is what I'm trying to do now:

maps <- k_constant(coordinates(75, 125))
maps <- k_reshape(maps, list(NULL, dim(maps)[1], dim(maps)[2], dim(maps)[3]))

However, this gives me an error:

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  TypeError: Failed to convert object of type <type 'tuple'> to Tensor. Contents: (None, 75, 125, 13). Consider casting elements to a supported type.

Detailed traceback: 
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 1969, in reshape
    return tf.reshape(x, shape)
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/tensorflow/python/ops/gen_array_ops.py", line 7179, in reshape
    "Reshape", tensor=tensor, shape=shape, name=name)
  File "/home/rstudio-user/.virtualenvs/r-tensorflow/lib/python2.7/site-packages/tensorflow/python/framework/op_def_library.py", line 514, in _apply_op_helper
    raise err

I'm not sure how to fix this. Every other reshape I've tried either keeps the 'maps' tensor 3-dimensional or fixes the batch-size dimension at 1. I want the batch-size (first/samples) dimension to remain NULL so that it can combine with variable batch sizes during training.
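
What I think I need is something along these lines (an untested sketch, assuming k_shape(x)[[1]] returns the dynamic batch size, as in the Keras VAE example): add a leading dimension of 1 with k_expand_dims(), then tile the constant to the batch size inside the lambda, so the first dimension is resolved at run time rather than when the graph is built.

input <- layer_input(shape = c(75, 125, 3)) %>%
  layer_lambda(function(x) {
    m <- k_constant(maps)                     # (75, 125, 13)
    m <- k_expand_dims(m, axis = 1)           # (1, 75, 125, 13)
    batch <- k_shape(x)[[1]]                  # dynamic batch size of x
    m <- k_tile(m, k_stack(list(batch, 1L, 1L, 1L)))
    k_concatenate(list(x, m))                 # (?, 75, 125, 16)
  })

If that works, the coordinate channels live entirely inside the graph and nothing extra needs to be fed in at training time.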
