Combine two neural networks with different inputs through element-wise summation of certain layers
I am looking to combine two convolutional neural networks into one through element-wise summation of the activations of corresponding layers. The two networks have different inputs but similar architectures.
I have seen from papers and GitHub pages that this has been implemented successfully in Python. However, I was wondering whether it is also possible to implement in MATLAB.
One example of what I want to reproduce is the FuseNet architecture by Hazirbas et al.: https://github.com/zanilzanzan/FuseNet_PyTorch
Is it possible to reproduce this in MATLAB, and if so, how do I start?
matlab neural-network computer-vision classification semantic-segmentation
asked Nov 8 at 11:13 by Isa El Doori
edited Nov 8 at 14:16 by Dev-iL
1 Answer
You might be able to do this using a DAG network (a DAGNetwork object) in MATLAB. The element-wise summation, specifically, can be performed using an additionLayer.
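For concreteness, here is a minimal sketch of the idea, assuming the Deep Learning Toolbox (R2017b or later); the layer names, filter counts, and input size are placeholders, not part of FuseNet itself. One shared input feeds two parallel convolutional branches, which are fused by element-wise summation via additionLayer:

% Minimal sketch: two parallel branches fused by element-wise summation.
% Assumes Deep Learning Toolbox; all names and sizes are placeholders.
lgraph = layerGraph();

% Single shared input (DAGNetwork supports only one input layer)
lgraph = addLayers(lgraph, imageInputLayer([224 224 4], 'Name', 'input'));

% Branch A (e.g. an RGB-like stream)
lgraph = addLayers(lgraph, [ ...
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'convA'), ...
    reluLayer('Name', 'reluA')]);

% Branch B (e.g. a depth-like stream)
lgraph = addLayers(lgraph, [ ...
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'convB'), ...
    reluLayer('Name', 'reluB')]);

% Element-wise summation of the two branches' activations
lgraph = addLayers(lgraph, additionLayer(2, 'Name', 'add'));

lgraph = connectLayers(lgraph, 'input', 'convA');
lgraph = connectLayers(lgraph, 'input', 'convB');
lgraph = connectLayers(lgraph, 'reluA', 'add/in1');
lgraph = connectLayers(lgraph, 'reluB', 'add/in2');

plot(lgraph)  % visualize the fused topology

Note that both inputs to an additionLayer must have the same height, width, and channel count.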
answered Nov 8 at 13:56 by Dev-iL

Hi Dev-iL, I was able to reproduce this network. However, since a DAGNetwork only allows a single input, my question becomes: would it be possible to have a 4-channel input where 3 channels go to one side of the network and the remaining channel goes to the other side? – Isa El Doori, Nov 8 at 17:43
@IsaElDoori I don't know of an easy way to achieve that. What you could do is define a custom layer that does this "unzipping", essentially the reverse of a depth concatenation layer. Either that, or a custom layer whose only job is to forward certain dimensions (or slices) of the input. – Dev-iL, Nov 8 at 18:55
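For reference, a custom layer along the lines Dev-iL describes might look like the following hypothetical, untested sketch; channelSliceLayer is not a built-in, and the backward method simply scatters the gradient back, with zeros in the channels that were not forwarded:

classdef channelSliceLayer < nnet.layer.Layer
    % Hypothetical layer that forwards only selected channels of its
    % input -- essentially the reverse of a depth concatenation layer.
    properties
        Channels  % indices of the channels to pass through
    end
    methods
        function layer = channelSliceLayer(name, channels)
            layer.Name = name;
            layer.Channels = channels;
            layer.Description = "Forwards channels " + mat2str(channels);
        end
        function Z = predict(layer, X)
            % X is H-by-W-by-C(-by-N); keep only the requested channels
            Z = X(:, :, layer.Channels, :);
        end
        function dLdX = backward(layer, X, ~, dLdZ, ~)
            % Gradient is zero for the channels that were not forwarded
            dLdX = zeros(size(X), 'like', X);
            dLdX(:, :, layer.Channels, :) = dLdZ;
        end
    end
end

Two such layers placed right after a 4-channel input, e.g. channelSliceLayer('rgb', 1:3) and channelSliceLayer('d', 4), would route three channels to one branch and the remaining channel to the other.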