
Conversation

@PariksheetPinjari909
Contributor

LRN operator for the Caffe AlexNet model, to be used with the ONNX frontend.

@PariksheetPinjari909 PariksheetPinjari909 changed the title LRN Operator [WIP]LRN Operator Mar 13, 2018
@PariksheetPinjari909 PariksheetPinjari909 changed the title [WIP]LRN Operator LRN Operator Mar 14, 2018
@tqchen
Member

tqchen commented Mar 15, 2018

@sxjscience can you help review this?

@tqchen tqchen requested a review from sxjscience March 15, 2018 02:18
@tqchen tqchen changed the title LRN Operator [TOPI] LRN Operator Mar 15, 2018
@sxjscience
Member

I'll check the formula next week... This week I've decided to focus on the MT demo.

@sxjscience
Member

It looks like the functionality can be implemented by stacking multiple reduce/broadcast ops in TOPI.

@PariksheetPinjari909
Contributor Author

@sxjscience Using broadcast/reduce I feel it will be difficult, since we need to compute the sum over a selected window determined by the "local size" parameter; that's why I used reduce_axis for the sum operation. Can you give a reference for how to use broadcast/reduce operators in this scenario?

@sxjscience
Member

sxjscience commented Mar 19, 2018 via email

@PariksheetPinjari909
Contributor Author

I will also upload the L2norm operator implementation along with this pull request; please help review them together.

@PariksheetPinjari909 PariksheetPinjari909 changed the title [TOPI] LRN Operator [TOPI] LRN & L2norm Operator Mar 19, 2018
@tqchen
Member

tqchen commented Mar 25, 2018

ping @sxjscience, can you take another look?

import tvm

@tvm.target.generic_func
def l2norm_instance_nchw(data, eps):
Member

We should implement L2Norm with arbitrary shape and axis.
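A rough sketch of what that could look like, stacking existing TOPI reduce/broadcast ops; the function name, the keepdims trick, and the eps handling (added under the square root) are assumptions for illustration, not the PR's code:

import tvm
import topi

def l2_normalize_sketch(data, eps, axis=None):
    # sum of squares over the requested axes, keeping the reduced dims
    # so the result still broadcasts against the input
    sqr_sum = topi.sum(topi.broadcast_mul(data, data), axis=axis, keepdims=True)
    # fold eps in before the square root (one common variant of eps handling)
    norm = topi.sqrt(tvm.compute(sqr_sum.shape, lambda *i: sqr_sum(*i) + eps))
    # normalize every element by the norm of its reduction group
    return topi.broadcast_div(data, norm)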


##Add padding on left & right of size radius first
pad_before = [0, int(size/2), 0, 0]
pad_after = [0, int(size/2), 0, 0]
Member

use size // 2 instead of int(size/2)
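For illustration, the suggested form (floor division keeps the pad width an integer without the float round-trip):

pad_before = [0, size // 2, 0, 0]
pad_after = [0, size // 2, 0, 0]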

rxk = tvm.reduce_axis((0, size), name='rxk')
sqr_sum = tvm.compute((b, c, h, w), lambda i, l, j, k: tvm.sum(
pad_data[i, l + rxk, j, k] * pad_data[i, l + rxk, j, k],
axis=rxk))
Member

@sxjscience sxjscience Mar 25, 2018

We can first square the input and then use topi.sum, for example: (edit) I had misunderstood the code; we cannot do this here, since the sum runs over a sliding window of size channels rather than over a whole axis.

lrn_out[i, c, j, k] = a_np[i, c, j, k] / \
sqr_sum_up[i, c, j, k]

return lrn_out
Member

We should move the python implementation to test_topi_lrn.py.
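As an illustration, a self-contained NumPy reference along the lines of the quoted snippet could live in test_topi_lrn.py; the helper name and parameter order here are assumptions, and the formula follows the across-channel LRN used in this PR (sum of squares over a window of size channels, with alpha divided by size):

import numpy as np

def lrn_nchw_python(a_np, size, bias, alpha, beta):
    # across-channel LRN reference for NCHW data
    b, c, h, w = a_np.shape
    radius = size // 2
    # pad the channel axis so every output channel sees a full window
    pad = np.zeros((b, c + 2 * radius, h, w), dtype=a_np.dtype)
    pad[:, radius:radius + c, :, :] = a_np
    sqr_sum = np.zeros_like(a_np)
    for i in range(c):
        sqr_sum[:, i, :, :] = np.sum(pad[:, i:i + size, :, :] ** 2, axis=1)
    return a_np / np.power(bias + alpha * sqr_sum / size, beta)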

b = tvm.nd.array(np.zeros(get_const_tuple(B.shape), dtype=dtype), ctx)
f = tvm.build(s, [A, B], device)
f(a, b)
np.testing.assert_allclose(b.asnumpy(), b_np, rtol=1e-1)
Member

@sxjscience sxjscience Mar 25, 2018

I feel that rtol=1e-1 is too loose. We should set it to a smaller value like 1E-4 or 1E-3.
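For example, with float32 inputs the same check could use a much tighter tolerance:

np.testing.assert_allclose(b.asnumpy(), b_np, rtol=1e-4)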


@tvm.target.generic_func
def l2norm_instance_nchw(data, eps):
"""Perform local response normalisation on the data
Member

This is L2Norm instead of local response normalization.
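A possible fix for the quoted docstring, assuming the NCHW-specific entry point is kept for now:

@tvm.target.generic_func
def l2norm_instance_nchw(data, eps):
    """Perform instance-wise L2 normalization on NCHW input data"""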


@tvm.target.generic_func
def lrn_nchw(data, size, alpha=0.0001, beta=0.75, bias=2):
"""Perform local response normalisation on the data
Member

(bias + (alpha * sqr_sum[i, j, k, l] / size)), beta))

return tvm.compute(data.shape,
lambda b, c, h, w: data[b, c, h, w] / sqr_sum_up[b, c, h, w])
Member

@sxjscience sxjscience Mar 25, 2018

I think we can reuse the implemented binary broadcasting operators like topi.broadcast_div.
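A minimal sketch of that suggestion, assuming sqr_sum_up already has the same NCHW shape as data:

return topi.broadcast_div(data, sqr_sum_up)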

@PariksheetPinjari909
Contributor Author

PariksheetPinjari909 commented Mar 26, 2018

@tqchen can you please reopen this PR? I accidentally closed it.

@PariksheetPinjari909
Contributor Author

The review comments are addressed in #1051.
