More Strongly-Typed Rust Bindings #2559
Conversation
Thank you for testing this out and making nice suggestions! Any suggestions are more than welcome, of course :) FYI @nhynes has been working on some more ergonomic enhancements for
So far I've kept compatibility with the other frontends (Python, Java and Golang) in mind, but that doesn't mean we can't add more. Off the top of my head, when #2498 is merged it could open up some good opportunities in Rust.
Hey @kazimuth thanks a lot for the PR! This is really cool! It looks like you've made a lot of overlapping changes with my branch which unifies
let mut module = GraphModule::from_paths(
    concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_graph.json"),
    concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_lib.so"),
    concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_param.params"),
note that the established pattern is to load the params independently of the graph lib+json since one might want to load multiple params sets (possibly over RPC, too). The Rust runtime crate already implements this pattern, so we should try to use that GraphModule but substitute in @ehsanmok's bindings when requested.
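For illustration, a minimal sketch of that pattern, assuming a two-argument constructor and a `load_params` method on `GraphModule` (both names and signatures here are hypothetical, not the PR's actual API):

```rust
fn main() -> Result<(), Box<dyn std::error::Error>> {
    // graph json + module lib are loaded together...
    let mut gmod = GraphModule::from_paths(
        concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_graph.json"),
        concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_lib.so"),
    )?;
    // ...while the params blob is loaded on its own, so it could just as well
    // arrive over RPC or be swapped for a different params set at runtime.
    let params = std::fs::read(concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_param.params"))?;
    gmod.load_params(&params)?;
    Ok(())
}
```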
Makes sense, I was just aiming to minimize lines of example code.
I wasn't sure about providing from_paths at all because it won't work in e.g. wasm. Might make sense to put behind a #[cfg].
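For example, a sketch of the #[cfg] idea (the cfg condition and the method body are assumptions, not part of this PR):

```rust
impl GraphModule {
    /// Filesystem-based constructor, compiled out on targets like wasm where
    /// there is no filesystem to load from.
    #[cfg(not(target_arch = "wasm32"))]
    pub fn from_paths(graph_json: &str, lib: &str, params: &str) -> Result<Self> {
        // read the three artifacts from disk and defer to an in-memory
        // constructor (elided in this sketch)
        unimplemented!()
    }
}
```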
        .collect(),
)
static ref GLOBAL_FUNCTIONS: Mutex<HashMap<String, Function>> = {
    Mutex::new(HashMap::new())
Can you provide a bit of motivation for not pre-allocating the map with the already known and presumed fixed set of globals?
I believe TVMFuncRegisterGlobal is meant for adding new global functions at runtime, which means that the set of names may change?
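For reference, the C entry point being discussed looks roughly like this when declared from Rust (signature recalled from TVM's c_runtime_api.h, so treat it as approximate):

```rust
use std::os::raw::{c_char, c_int, c_void};

// Opaque handle to a TVM PackedFunc on the C side.
type TVMFunctionHandle = *mut c_void;

extern "C" {
    // Registers `f` under `name` in the global function table; `override_`
    // controls whether an existing registration may be replaced.
    fn TVMFuncRegisterGlobal(name: *const c_char, f: TVMFunctionHandle, override_: c_int) -> c_int;
}
```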
Yes, they could change, but functions can also be removed, which makes any caching ill-advised. The de facto semantics of TVM globals is static registration. Always re-wrapping pointers might be an option, but I'm not sure the performance cost would be worth the (unused) additional flexibility.
On the bright side, part of the reason for the Rust runtime has been to remove these pesky globals. As we further expand the Rust functionality, we won't have to think as much about C++ interop.
Once I finish up unifying the tvm crate and merging in your PR, a good next area of focus might be creating device APIs to enable use cases like GPUs (let's start with AMD because you seem to have a real use case) and VTA.
    &mut handle as *mut _
));
if handle.is_null() {
    bail!(ErrorKind::FunctionNotFound(name.to_string()));
Just FYI, you can use ensure! instead of if ... { bail!() }
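i.e. (assuming the error-chain-style ensure! that goes with the bail!/ErrorKind usage above), the null check could read:

```rust
// equivalent to `if handle.is_null() { bail!(...) }`
ensure!(
    !handle.is_null(),
    ErrorKind::FunctionNotFound(name.to_string())
);
```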
Ooh, nice
///
/// Note that only the last segment of the function path is used for the function name.
#[macro_export]
macro_rules! wrap_packed_globals {
I'd probably go as far as making this an actual Fn type (cf. runtime::PackedFunc).
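Roughly, that could mean having the macro produce values of a boxed closure type rather than bare functions. A sketch, with placeholder value types standing in for whatever argument/return types the crate actually uses:

```rust
// Placeholders so the alias below is self-contained; the real crate has its
// own argument/return value types.
pub struct TVMArgValue;
#[derive(Default)]
pub struct TVMRetValue;

/// A boxed callable in the spirit of runtime::PackedFunc.
pub type PackedFunc = Box<dyn Fn(&[TVMArgValue]) -> TVMRetValue + Send + Sync>;

/// What a wrapped global might look like as a value of that type.
pub fn example_packed() -> PackedFunc {
    Box::new(|_args: &[TVMArgValue]| {
        // the real wrapper would forward to the underlying TVM function handle
        TVMRetValue::default()
    })
}
```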
let get_output_ = module.get_function("get_output", false).expect("no get_output, invalid GraphModule");
let get_num_outputs_ = module.get_function("get_num_outputs", false).expect("no get_num_outputs, invalid GraphModule");
let run_ = module.get_function("run", false).expect("no run_, invalid GraphModule");
let load_params_ = module.get_function("load_params", false).expect("no load_params_, invalid GraphModule");
Ideally we should have exactly one GraphModule which either uses a Rust Module implementation or shells out to the bindings. Code duplication makes it decidedly less maintainable.
My understanding was that Module is a type wrapping a dynamic library that could have any number of named functions, e.g. one created with tvm.contrib.cc; whereas GraphModule is a particular API that some Modules expose. So it made sense to make GraphModule a simple typed Module wrapper.
Kind of. There is really no graph API inside of the module: it's purely defined by the graph.json. For instance, a module with dense1, batch_norm, and softmax could create a graph with {dense1, softmax} just as easily as one with all three. Basically the Module is a repo of functions and the graph.json determines how those are actually called.
Ok(GraphModule { module, set_input_, get_input_, get_output_, get_num_outputs_, run_, load_params_ })
}

// TODO: should these functions return Result?
Yes. For instance, loading params can fail if the user provides a bogus params file.
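Concretely, that would mean signatures along these lines (a sketch only; the parameter type and body are assumptions):

```rust
impl GraphModule {
    /// Load a serialized params blob; a bogus params file surfaces as an Err
    /// instead of a panic.
    pub fn load_params(&mut self, params: &[u8]) -> Result<()> {
        // forward to the `load_params_` PackedFunc held by this struct
        // (elided in this sketch)
        unimplemented!()
    }
}
```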
pub mod bytearray;
pub mod context;
#[allow(deprecated)]
pub mod errors;
let image: ndarray::Array = /* ... load image in Rust ... */;

// convert to a tvm NDArray
// could be on a GPU, FPGA, ...
mind if I ask which exactly you're trying to use? ;)
my dream is to run deep learning stuff on my AMD GPU desktop, which I've generally found to be nigh-impossible without TVM ;-;
also, rustfmt-on-save using rust.vim is 👌
Thanks for the advice @nhynes, I'll delay until you're done with that change -- didn't see your PR :) and no rush, I was just poking at things and figured I could chip in. Are you planning on unifying
Hey @nhynes, any update on your PR? No rush, just curious
Hey @kazimuth super sorry for the long delay. I was planning on taking a TVM day tomorrow and Sunday. I don't want to block your contribution since it contains a number of useful improvements, so if I don't have the merge done by Monday, I'll merge this and rebase. Thanks!
@kazimuth Would you want to still work on adding
I think that what we actually want is a
@kazimuth, if you're still interested in working on improving the frontend graph runtime, we should discuss the design of this component. It'd be great if we could start using conditional compilation to make a single
@nhynes In the case of the frontend, I think
I'm not disagreeing with the utility, but the name leaves something to be desired since it's decidedly not a module in the TVM sense. Regardless of nomenclature, we only need one thing that provides the graph runtime/module/executor abstraction.
If you can give me a broad-strokes design (like some type names + general APIs) I'd be happy to implement it.
I'm not sure I understand what this means, could you elaborate?
So from a high level I mean something like:

impl GraphExecutor { // TODO: name
    fn set_input(&mut self, input: Tensor) {
        #[cfg(feature = "bindings")]
        {
            module.get_function("set_input", ...);
        }
        #[cfg(not(feature = "bindings"))]
        {
            // pure rust impl
        }
    }
    // ...
}
Ah, I guess I'm still a little confused about the distinction between bindings and pure-Rust; I thought that distinction had been eliminated. Is pure-Rust just for, like, the wasm and SGX runtimes, or is it going to have a bigger application area? Also, looking at
Side note: I think I'm going to close this PR and start a new one for the new changes.
It's a constant WIP. The two crates now share common types, but I suspect that there's a decent amount of duplicated logic. I'd also like for users to be able to do
Actually, if you're feeling ambitious, you could actually go and implement a shared object module in
Also, I remember you saying that you wanted an easy-to-use Rust ML library. I had a neat idea that you might like: an option to build the entire GraphExecutor at compile time from a pre-defined TVM module and graph json. Like there'll be a macro that generates all of the
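Purely to illustrate the idea, usage of such a macro might look something like this; every name here is hypothetical and nothing like it exists yet:

```rust
// Hypothetical: read the graph json + module at build time and generate a
// struct with one typed setter/getter per graph input/output.
tvm_graph_module! {
    Resnet18,
    graph = "deploy_graph.json",
    lib = "deploy_lib.so",
}

fn main() {
    // `Resnet18`, `set_data`, `run`, and `get_output0` would all be generated.
    let mut model = Resnet18::new();
    model.set_data(/* input tensor */);
    model.run();
    let _probs = model.get_output0();
}
```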
Is it possible to call a PackedFunc from a deployed library without using C++? Don't you need to go through the C runtime API to call any PackedFunc? Or is that actually implemented somewhere in the rust codebase?
Oh, that is an interesting idea. It would be easier if GraphExecutor was already fully Rust too. Question: is the current deploy_graph.json / deploy_lib.so format stable, or is it going to be replaced by a new format to support Relay? I was under the impression it was based on the old NNVM graphs; not sure it's worth the effort to port if it's going to be immediately deprecated.
Yes, if there were a way to load the functions from a dynamic library. The standalone runtime already does this for statically linked module libraries.
Well, strictly speaking, the GraphExecutor only exists in the standalone (Rust) runtime :)
Yes, Relay produces the same output as NNVM. At some point, we might need to add support for the more interesting control flow operators, though. That's still WIP on the C++ side AFAIK.
Closing this because it's bitrotted, will re-start this work later
Hi @kazimuth, just pinging to see if you're still interested and would like to propose a deadline for it?
I am, but my workload's picked up a bit; might be a few weeks before I get back to this :)
After reading #2523 I was messing with the newly-merged Rust API and noticed some changes that might be helpful.
The main change this PR makes is to start adding some "typed" Rust APIs, which look like normal Rust functions/structs and call TVM PackedFuncs under the hood. The Rust graph runtime example is now much simpler:
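Roughly, based on the diff hunks quoted earlier in the thread, the simplified example has this shape (the constructor and method names come from those hunks; the input handling and error handling are illustrative):

```rust
fn main() -> Result<(), Box<dyn std::error::Error>> {
    // from_paths is assumed to return a Result here
    let mut module = GraphModule::from_paths(
        concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_graph.json"),
        concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_lib.so"),
        concat!(env!("CARGO_MANIFEST_DIR"), "/deploy_param.params"),
    )?;
    // prepare an input NDArray (e.g. from an ndarray::Array as in the diff), then:
    // module.set_input("data", &input);
    // module.run();
    // let output = module.get_output(0);
    Ok(())
}
```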
Of course, all the old PackedFunc functionality is still there. This just makes well-defined APIs a little easier to use.
This approach could be used for binding the non-runtime APIs as well, I think. (Which I want, because I kinda wanna make a TVM-backed Rust deep learning framework at some point... that's the far future though ;))
I've also fixed up a few other random things I've noticed:
- Function::get's implementation would break for freshly registered globals, so I made it more dynamic
- kNDArrayContainers can now be returned from PackedFuncs without blowing up... probably. The example works, anyway. I don't entirely understand what's going on with the stuff in common::value though.

TODO:
Open questions:
CC @ehsanmok, hope it's okay that I'm poking around in your code :)