
Add Feature that can go back and forward in rewrite history #874

Draft: wants to merge 4 commits into base main

Conversation

@ricardoV94 (Member) commented Jun 28, 2024

Description

I can imagine this being very powerful for debugging and for reasoning about rewrites.

import pytensor
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph
from pytensor.graph.features import FullHistory
from pytensor.graph.rewriting.utils import rewrite_graph

x = pt.scalar("x")
out = pt.log(pt.exp(x) / pt.sum(pt.exp(x)))

fg = FunctionGraph(outputs=[out])
history = FullHistory()
fg.attach_feature(history)

rewrite_graph(fg, clone=False, include=("canonicalize", "stabilize"))

# Replay rewrites
history.start()
pytensor.dprint(fg)
pytensor.config.optimizer_verbose = True
for i in range(3):
    print()
    print(">>> ", end="")
    pytensor.dprint(history.next())

# Log [id A] 4
#  └─ True_div [id B] 3
#     ├─ Exp [id C] 2
#     │  └─ x [id D]
#     └─ Sum{axes=None} [id E] 1
#        └─ Exp [id F] 0
#           └─ x [id D]
# >>> MergeOptimizer
# Log [id A] 3
#  └─ True_div [id B] 2
#     ├─ Exp [id C] 0
#     │  └─ x [id D]
#     └─ Sum{axes=None} [id E] 1
#        └─ Exp [id C] 0
#           └─ ···
# >>> local_mul_canonizer
# Log [id A] 1
#  └─ Softmax{axis=None} [id B] 0
#     └─ x [id C]
# >>> local_logsoftmax
# LogSoftmax{axis=None} [id A] 0
#  └─ x [id B]

# Or in reverse
for i in range(3):
    print()
    print(">>> ", end="")
    pytensor.dprint(history.prev())

# >>> local_logsoftmax
# Log [id A] 1
#  └─ Softmax{axis=None} [id B] 0
#     └─ x [id C]
# >>> local_mul_canonizer
# Log [id A] 3
#  └─ True_div [id B] 2
#     ├─ Exp [id C] 0
#     │  └─ x [id D]
#     └─ Sum{axes=None} [id E] 1
#        └─ Exp [id C] 0
#           └─ ···
# >>> MergeOptimizer
# Log [id A] 4
#  └─ True_div [id B] 3
#     ├─ Exp [id C] 2
#     │  └─ x [id D]
#     └─ Sum{axes=None} [id E] 1
#        └─ Exp [id F] 0
#           └─ x [id D]
    
pytensor.config.optimizer_verbose = False
# Or go to any step
pytensor.dprint(history.goto(2))
# Log [id A] 1
#  └─ Softmax{axis=None} [id B] 0
#     └─ x [id C]
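For intuition, the back/forward behavior above can be pictured as a checkpoint list with a movable pointer. The following is a minimal plain-Python sketch of that pattern (class and method names here are illustrative, not the actual pytensor `FullHistory` implementation, which operates on `FunctionGraph` state rather than opaque snapshots):

```python
class ReplayHistory:
    """Hypothetical sketch of a back/forward rewrite history.

    Each applied rewrite records a checkpoint; next(), prev(), and goto()
    move a pointer through the recorded checkpoints.
    """

    def __init__(self, initial_state):
        self.states = [initial_state]  # one checkpoint per applied rewrite
        self.pointer = 0               # index of the currently restored state

    def record(self, new_state):
        # Called after each rewrite is applied.
        self.states.append(new_state)

    def start(self):
        # Rewind to the graph as it was before any rewrite.
        self.pointer = 0
        return self.states[0]

    def next(self):
        # Step forward one rewrite (clamped at the latest state).
        if self.pointer < len(self.states) - 1:
            self.pointer += 1
        return self.states[self.pointer]

    def prev(self):
        # Step back one rewrite (clamped at the initial state).
        if self.pointer > 0:
            self.pointer -= 1
        return self.states[self.pointer]

    def goto(self, i):
        # Jump directly to the state after rewrite i, clamped to valid range.
        self.pointer = max(0, min(i, len(self.states) - 1))
        return self.states[self.pointer]
```

The real feature presumably restores the `FunctionGraph` in place at each step rather than returning stored copies, but the pointer arithmetic is the same.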

Related Issue

  • Closes #
  • Related to #

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

@ricardoV94 added the "enhancement (New feature or request)" and "graph rewriting" labels on Jun 28, 2024
import pytensor
import pytensor.tensor as pt
from pytensor.graph.fg import FunctionGraph
from pytensor.graph.features import History, FullHistory
@ricardoV94 (Member, Author)

Suggested change:
- from pytensor.graph.features import History, FullHistory
+ from pytensor.graph.features import FullHistory


@ricardoV94 ricardoV94 marked this pull request as draft July 5, 2024 19:33