feat: Draft for hugr-model #1542
with export, import, parsing and pretty printing
base: main

Conversation
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@ Coverage Diff @@
##             main    #1542      +/-  ##
=========================================
- Coverage   87.43%   85.71%   -1.72%
=========================================
  Files         127      132        +5
  Lines       21740    24017    +2277
  Branches    18740    21017    +2277
=========================================
+ Hits        19008    20587    +1579
- Misses       1964     2366      +402
- Partials      768     1064      +296

Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
first review of mod.rs
Some general comments, I'll do mod/export next.
Other tidbits:
- Add hugr-model to .github/change-filters.yml
Co-authored-by: Agustín Borgna <[email protected]>
Also added hugr-model to change-filters.yml
(Looked at parse.rs and print.rs.) LGTM, I just highlighted some TODO comments that you may or may not want to address here (if not, we should make issues).
import first pass, second half coming soon
import round 2
hugr-core/src/import.rs (Outdated)

    Err(error_uninferred!("application with implicit parameters"))
}

model::Term::ApplyFull { name, args } => {
While I understand the rationale, I find the encoding of runtime custom types as ApplyFull still confusing on first interaction. Either a rename or some comments here would help.
I've added some docs now. Does that make it clearer?
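For readers hitting the same confusion, here is a minimal sketch of the idea; the enum, constructor name, and rendering function below are illustrative stand-ins, not the actual hugr-model API. A runtime custom type is represented as a fully-applied reference to its type constructor, so "applying" a symbol and "naming a custom type" are one and the same term shape:

```rust
// Illustrative sketch only: a fully-applied term holds a constructor
// name plus all of its arguments, so a custom runtime type such as a
// 2^6-valued integer becomes an ApplyFull of its constructor.
#[derive(Debug, Clone, PartialEq)]
enum Term {
    /// A fully-applied symbol: no implicit parameters remain to infer.
    ApplyFull { name: String, args: Vec<Term> },
    /// A natural-number literal used as a type argument.
    Nat(u64),
}

/// Render a term in an s-expression style, for illustration.
fn render(term: &Term) -> String {
    match term {
        Term::ApplyFull { name, args } if args.is_empty() => name.clone(),
        Term::ApplyFull { name, args } => {
            let rendered: Vec<String> = args.iter().map(render).collect();
            format!("({} {})", name, rendered.join(" "))
        }
        Term::Nat(n) => n.to_string(),
    }
}

fn main() {
    // A custom integer type as a fully-applied constructor with one argument.
    let int6 = Term::ApplyFull {
        name: "arithmetic.int".to_string(),
        args: vec![Term::Nat(6)],
    };
    println!("{}", render(&int6));
}
```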
Co-authored-by: Seyon Sivarajah <[email protected]>
let mut module = model::Module::default();
module.nodes.reserve(hugr.node_count());
Add a Module::with_capacity instead?
Nodes, terms and regions have different capacities, so that'd be a with_capacity with quite a few parameters (all of the same type, don't mix them up!).
Then maybe a reserve_nodes method instead, which can be documented instead of relying on knowledge of how the Model works internally.
There isn't really an "internally": it's plain old data, and everything is pub.
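The trade-off being debated can be sketched in a few lines; the struct and method names here are hypothetical, mirroring the discussion rather than the real hugr-model types. A single with_capacity takes several same-typed parameters that are easy to transpose at the call site, whereas per-arena reserve methods are self-documenting:

```rust
// Illustrative sketch (not the real hugr-model API): a Module owning
// separate arenas for nodes, terms and regions.
#[derive(Default)]
struct Module {
    nodes: Vec<u32>,
    terms: Vec<u32>,
    regions: Vec<u32>,
}

impl Module {
    /// The objection: three `usize` parameters in a row are easy to
    /// mix up at the call site with no help from the type system.
    fn with_capacity(nodes: usize, terms: usize, regions: usize) -> Self {
        Module {
            nodes: Vec::with_capacity(nodes),
            terms: Vec::with_capacity(terms),
            regions: Vec::with_capacity(regions),
        }
    }

    /// The alternative: one named, documented method per arena.
    fn reserve_nodes(&mut self, additional: usize) {
        self.nodes.reserve(additional);
    }
}

fn main() {
    let m = Module::with_capacity(8, 4, 2);
    assert!(m.nodes.capacity() >= 8 && m.regions.capacity() >= 2);

    let mut m2 = Module::default();
    m2.reserve_nodes(16);
    assert!(m2.nodes.capacity() >= 16);
}
```

Since the fields are all pub in the actual model, calling reserve directly on the field (as the PR does) is equally valid; the question is purely ergonomic.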
}
}

pub fn export_root(&mut self) {
Why is the logic for the root module of the Hugr different from the nodes'?
This seems to assume the hugr root is a Module. As far as I understand, passing a DFG-rooted hugr here will fail when calling self.export_node(child) on the input/output nodes.
Some root OpTypes like Case and ExitBlock should be detected early and rejected here.
The core root module node is exported not as a node but as a region. I was not aware that there can be Hugrs whose roots are not Modules...
I propose we postpone this: #1554
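The early-rejection idea suggested above can be sketched as follows; the OpType variants and the function name check_exportable_root are hypothetical simplifications, not hugr-core's actual API:

```rust
// Illustrative sketch: reject op types that cannot stand alone as a
// hugr root before descending into the children, instead of failing
// later inside export_node.
#[derive(Debug, PartialEq)]
enum OpType {
    Module,
    Dfg,
    Case,
    ExitBlock,
}

fn check_exportable_root(root: &OpType) -> Result<(), String> {
    match root {
        // Ops that form a self-contained region may serve as a root.
        OpType::Module | OpType::Dfg => Ok(()),
        // Ops that only make sense inside a parent are rejected early,
        // with an error naming the offending op.
        other => Err(format!("cannot export hugr rooted at {:?}", other)),
    }
}

fn main() {
    assert!(check_exportable_root(&OpType::Module).is_ok());
    assert!(check_exportable_root(&OpType::ExitBlock).is_err());
}
```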
This PR defines a draft of the model format, and the following parts that operate on the format:
This PR is very big, and yet it does not present a completed feature. The model and core format differ in several aspects. Where they differ, this is mostly intentional: the model is kept flexible enough to incorporate features that we know we want in the future but that are currently hard or impossible to express in the core. Import and export perform conversions where possible, but do not yet cover everything.
To prevent this PR from becoming even bigger and to allow core and model to converge incrementally from both directions, I suggest the following plan: we merge this PR into main with the model-related code feature-gated and considered experimental, then tweak model and core in smaller PRs where necessary until they are sufficiently close.
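The feature-gating part of this plan would look roughly like the sketch below; the feature name "model" is an assumption for illustration, and the gated module body is a placeholder:

```rust
// Sketch of feature-gating the experimental code, assuming a Cargo
// feature hypothetically named "model":
//
//   [features]        # in Cargo.toml
//   model = []
//
// The experimental module is compiled only when the feature is enabled.
#[cfg(feature = "model")]
pub mod model {
    // placeholder: model export/import/parse/print would live here
    pub fn import() {}
}

/// Call sites can also branch on the feature at compile time.
pub fn model_enabled() -> bool {
    cfg!(feature = "model")
}

fn main() {
    // With the feature off (the default), the gated module is absent
    // and this reports false.
    println!("model feature enabled: {}", model_enabled());
}
```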