
Custom interpolation layer. Gluon. #10249

Closed
ashkanaev opened this issue Mar 26, 2018 · 2 comments

Comments


ashkanaev commented Mar 26, 2018

How can I implement a custom layer (bilinear interpolation) for Gluon without knowing the input tensor shape?
The problem is the hard-coded target size. I want a layer that resizes an arbitrary tensor by some factor.
I found that SpatialTransformer is what I really need, but the fixed target size makes this code inflexible.

Could I implement such a custom layer using symbols?

import mxnet as mx
from mxnet.gluon import Block

class Inter(Block):
    def __init__(self, factor, target_size, **kwargs):
        super(Inter, self).__init__(**kwargs)
        # 2x3 affine matrix for uniform scaling by `factor`, flattened to (1, 6)
        self.affine_matrix = mx.nd.reshape(
            mx.nd.array([[1. * factor, 0, 0],
                         [0, 1. * factor, 0]]), shape=(1, 6))
        self.target_size = target_size

    def forward(self, x):
        # bilinear sampling under a fixed affine transform; the hard-coded
        # target_shape is exactly the inflexibility described above
        out = mx.nd.SpatialTransformer(x,
                                       self.affine_matrix,
                                       target_shape=self.target_size,
                                       transform_type='affine',
                                       sampler_type='bilinear')
        return out

zhanghang1989 commented Apr 9, 2018

I guess this is what you are looking for: #9688

import mxnet as mx
x1 = mx.nd.ones(shape=(2,3,4,4))
y1 = mx.nd.contrib.BilinearResize2D(x1, out_height=5, out_width=5)
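
If I read the op correctly, y1 here should come out with shape (2, 3, 5, 5): only the spatial dimensions are resized to out_height x out_width, while the batch and channel dimensions stay unchanged.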

@szha


zhanghang1989 commented Apr 9, 2018

import mxnet as mx
from mxnet.gluon import Block

class BilinearInterp(Block):
    def __init__(self, target_height, target_width, **kwargs):
        super(BilinearInterp, self).__init__(**kwargs)
        self.height = target_height
        self.width = target_width
    def forward(self, x):
        return mx.nd.contrib.BilinearResize2D(x, out_height=self.height,
                                              out_width=self.width)
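
For the factor-based resizing in the original question, a minimal sketch along the same lines (the class name FactorInterp is hypothetical) that derives the target size from the input shape at runtime:

import mxnet as mx
from mxnet.gluon import Block

class FactorInterp(Block):
    def __init__(self, factor, **kwargs):
        super(FactorInterp, self).__init__(**kwargs)
        self.factor = factor
    def forward(self, x):
        # a Block runs imperatively, so the concrete input shape is known here
        _, _, h, w = x.shape
        return mx.nd.contrib.BilinearResize2D(x,
                                              out_height=int(h * self.factor),
                                              out_width=int(w * self.factor))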

@szha szha closed this as completed Apr 9, 2018