Hi,

I am starting my journey in ML and Julia. I do not know if this is the correct place to ask questions related to Lux, so please forgive me.
I would like to implement a custom layer in Lux and later use it in a neural ODE.
My task is to achieve the following:
My input is a 9x1 matrix. I want to use two dense layers to predict another 9x1 matrix.
I then want to pass it through an "eigen filter" to make it positive definite.
From the documentation (https://docs.juliahub.com/Lux/Jbrqh/0.4.12/manual/interface/) it seems that a container layer would be a good starting point for this.
Here is an example (if the values end up being imaginary, do not mind; it is a conceptual example):
```julia
using Lux, LinearAlgebra, Random

struct myComposedLinear{L1, L2} <: Lux.AbstractExplicitContainerLayer{(:dense_1, :dense_2)}
    dense_1::L1
    dense_2::L2
end

function (cl::myComposedLinear)(x::AbstractMatrix, ps, st::NamedTuple)
    y, st_l1 = cl.dense_1(x, ps.dense_1, st.dense_1)
    y, st_l2 = cl.dense_2(y, ps.dense_2, st.dense_2)
    # Fill a 3x3 matrix row by row from the 9 predicted values.
    mat = zeros(3, 3)
    mat[1, 1] = y[1]
    mat[1, 2] = y[2]
    mat[1, 3] = y[3]
    mat[2, 1] = y[4]
    mat[2, 2] = y[5]
    mat[2, 3] = y[6]
    mat[3, 1] = y[7]
    mat[3, 2] = y[8]
    mat[3, 3] = y[9]
    # "Eigen filter": take the absolute value of the eigenvalues and
    # reassemble the matrix so its spectrum is non-negative.
    w, v = eigen(mat)
    x_tmp = Diagonal(abs.(w))
    result = v * x_tmp * inv(v)
    return reshape(result, 9, 1), (dense_1 = st_l1, dense_2 = st_l2)
end

rng = Random.default_rng()
Random.seed!(rng, 1111)
model = myComposedLinear(Lux.Dense(9, 18), Lux.Dense(18, 9))
ps, st = Lux.setup(rng, model)
x = rand(rng, Float32, 9, 1)
result = Lux.apply(model, x, ps, st)[1]
```
My questions (please forgive my lack of knowledge, I am just starting out):
Is this a valid implementation? Can I create zero-initialized arrays inside myComposedLinear (the zeros(3, 3) above)? Will back-propagation work if I create (N, N) zero arrays that are later populated element by element? A non-mutating variant I am wondering about is sketched below.
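For reference, this is the kind of non-mutating variant I have in mind. It is only a sketch: it assumes Zygote (Lux's default AD backend) is doing the differentiation, since as far as I understand Zygote does not support mutating arrays, and I am not sure whether `eigen` of a general (non-symmetric) matrix even has AD rules.

```julia
function (cl::myComposedLinear)(x::AbstractMatrix, ps, st::NamedTuple)
    y, st_l1 = cl.dense_1(x, ps.dense_1, st.dense_1)
    y, st_l2 = cl.dense_2(y, ps.dense_2, st.dense_2)
    # reshape is non-mutating, so Zygote can differentiate through it.
    # Note: reshape fills column-major, so this is the transpose of the
    # row-by-row fill above (here mat[1, 2] is y[4], not y[2]).
    mat = reshape(y, 3, 3)
    w, v = eigen(mat)
    result = v * Diagonal(abs.(w)) * inv(v)
    return reshape(result, 9, 1), (dense_1 = st_l1, dense_2 = st_l2)
end
```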
Best Regards,
A quick follow-up on `mat = reshape(y, 3, 3)`: how would it handle a symmetric matrix if I supply 6 independent components?
I know from symmetry that `mat = [y[1] y[2] y[3]; y[2] y[4] y[5]; y[3] y[5] y[6]]`. Would this be acceptable for AD, or is there no chance here?
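To make the follow-up concrete, this is the kind of check I have in mind. It is just a sketch: it assumes Zygote as the AD backend, and the loss is an arbitrary scalar chosen only to test that a gradient can be pulled back through the matrix literal.

```julia
using Zygote

# Build the symmetric 3x3 matrix from 6 independent components.
# The matrix literal lowers to hvcat, which is non-mutating, so it
# should be friendlier to Zygote than filling a buffer in place.
sym(y) = [y[1] y[2] y[3];
          y[2] y[4] y[5];
          y[3] y[5] y[6]]

# Arbitrary scalar output, used only to exercise the pullback.
loss(y) = sum(abs2, sym(y))

g = Zygote.gradient(loss, rand(Float32, 6))
```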