Conversation
sufeidechabei commented Sep 19, 2019
- Performance can't match the original paper (I don't have enough GPUs to train on ImageNet).
- It doesn't support hybridization yet, because MXNet doesn't have 'same' padding.
zhanghang1989
left a comment
Please add a unit test for the model forward pass.
How do I add it? Can you give me an example, @zhanghang1989?
@sufeidechabei You can use the symbol.Pooling op to emulate 'same' padding, since it has a parameter for it.
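For context, TensorFlow-style 'same' padding is easy to compute explicitly, and with an odd total it splits asymmetrically, which a plain symmetric `pad` argument can't express. A minimal sketch of the arithmetic (the helper name `same_pad` is hypothetical, not part of MXNet):

```python
import math

def same_pad(size, kernel, stride):
    """Compute TF-style 'same' padding for one spatial dimension.

    Returns (pad_before, pad_after); the total padding keeps the
    output size equal to ceil(size / stride).
    """
    out = math.ceil(size / stride)
    total = max((out - 1) * stride + kernel - size, 0)
    before = total // 2          # with an odd total, the extra pixel goes after
    after = total - before
    return before, after
```

For example, `same_pad(6, 3, 2)` yields an asymmetric `(0, 1)` split.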
It only supports Symbol instances.
@sufeidechabei You can wrap it in a HybridBlock, or you can use HybridLambda.
I used a HybridBlock to wrap it, as you suggested, @chinakook
Like @chinakook said, please try to remove the usage of
Link to the model unit test:
@sufeidechabei I think you could add an argument to the init function to fix the input shape.
    input_filters=int(options['i']),
    output_filters=int(options['o']),
    expand_ratio=int(options['e']),
    id_skip=('noskip' not in block_string),
id_skip is not used?
Okay, I will fix that.
    _add_conv(
        self._se_reduce,
        num_squeezed_channels,
        active=False,
I believe there should be an activation function in the SE module.
I followed the PyTorch EfficientNet implementation, and it doesn't have an activation function in the SE block.
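For reference, the original SENet paper places a ReLU after the reduce FC and a sigmoid after the expand FC, so the question above concerns the activation between the two FC layers. A framework-agnostic NumPy sketch of that standard gating (the function name `se_gate` is illustrative):

```python
import numpy as np

def se_gate(x, w_reduce, w_expand):
    """Squeeze-and-excitation gating over an (N, C, H, W) tensor.

    Standard SENet form: global-average squeeze, then
    reduce -> ReLU -> expand -> sigmoid, then channel-wise rescale.
    """
    s = x.mean(axis=(2, 3))                      # squeeze: (N, C)
    z = np.maximum(s @ w_reduce, 0.0)            # reduce FC + ReLU
    g = 1.0 / (1.0 + np.exp(-(z @ w_expand)))    # expand FC + sigmoid
    return x * g[:, :, None, None]               # excite (broadcast over H, W)
```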