Need to grab 3x3 neighborhood of input cell from 2d numpy array

I am trying to define a function that will return the 3x3 neighborhood of an input cell. Right now I have:

def queen_neighbourhood(in_forest, in_row, in_col):

    neighbourhood = in_forest[in_row-1:in_row+1, in_col-1:in_col+1]

    return neighbourhood

(in_forest is the input array).

When I run this it returns a 2x2 matrix instead of a 3x3. Why is this? It looks to me like I am passing in a row and column reference and then cutting out a square that starts one row before the input row and ends one row after it, and the same for the columns.

So, for example, given the input array as such:

[ 01, 02, 03, 04, 05
  06, 07, 08, 09, 10
  11, 12, 13, 14, 15
  16, 17, 18, 19, 20
  21, 22, 23, 24, 25 ]

Then, given row 2, col 3, I want to return this matrix:

[ 02, 03, 04
  07, 08, 09
  12, 13, 14 ]



1 answer


When you write in_forest[in_row-1:in_row+1, in_col-1:in_col+1], you are saying: "Give me a square from in_row-1 inclusive to in_row+1 exclusive, and from in_col-1 inclusive to in_col+1 exclusive." A slice stops before, and does not include, the second index, so you get only two rows and two columns.

Use in_row-1:in_row+2 and in_col-1:in_col+2 instead, so the slice runs up to (but not including) the index one past the cell below and to the right.
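Putting that together, a corrected version of the function, demonstrated on the 5x5 example array from the question (note that NumPy is 0-indexed, so "row 2, col 3" from the example corresponds to indices (1, 2)):

```python
import numpy as np

def queen_neighbourhood(in_forest, in_row, in_col):
    # Slice stop indices are exclusive, so use +2 (not +1) to include
    # the row and column immediately after the input cell.
    return in_forest[in_row-1:in_row+2, in_col-1:in_col+2]

forest = np.arange(1, 26).reshape(5, 5)  # the 5x5 example array

print(queen_neighbourhood(forest, 1, 2))
# [[ 2  3  4]
#  [ 7  8  9]
#  [12 13 14]]
```

One caveat the question doesn't cover: for cells on the border, in_row-1 or in_col-1 becomes -1, which NumPy interprets as counting from the end of the array, so edge cells would need separate handling (e.g. clamping the start index to 0).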


