Conversation


@ITHwang ITHwang commented Apr 2, 2025

Implement basic tensor operations with autograd support

Summary

This PR implements a subset of PyTorch's tensor operations from scratch using NumPy as the computational backend. All operations support automatic differentiation through a define-by-run approach.

Features

  • Core tensor creation functionality with torch.tensor()
  • Basic arithmetic operations (+, -, *, /, negation)
  • Mathematical functions (square(), exp(), pow())
  • Full autograd support with backward propagation
  • Type safety ensuring only floating-point tensors can require gradients
  • Context manager (no_grad()) to disable gradient tracking for inference
  • Memory optimization using weak references to prevent circular reference leaks

Testing

  • Added comprehensive tests for all operations
  • Numerical gradient checking for complex functions
  • Verified memory optimization using before/after measurements
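
Numerical gradient checking, mentioned in the second bullet, typically compares an analytic gradient against a central-difference estimate. A minimal NumPy sketch of the idea (illustrative, not the PR's test code; `numerical_grad` is a hypothetical helper name):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-4):
    """Central-difference estimate of df/dx at a scalar x."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# Check y = x**2: the analytic gradient is 2*x.
x = 3.0
analytic = 2 * x
numeric = numerical_grad(lambda v: v ** 2, x)
assert np.isclose(analytic, numeric, atol=1e-6)
```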

Implementation Notes

  • Follows the Define-by-Run paradigm similar to PyTorch
  • Operations are implemented as Function objects with forward and backward methods
  • Each tensor stores references to its creator function for backpropagation
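
The creator-link mechanism described in the notes above can be sketched in a few lines. This is a simplified stand-alone version under assumed names (`Tensor`, `Square`, a free `backward` helper), not the PR's core.py: each operation records itself on its output during the forward pass, and backpropagation walks those links in reverse.

```python
import numpy as np

class Tensor:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = None
        self.creator = None  # the Function that produced this tensor

class Square:
    def __call__(self, x):
        self.input = x
        out = Tensor(x.data ** 2)
        out.creator = self       # define-by-run: graph is built as we compute
        return out

    def backward(self, gy):
        return 2 * self.input.data * gy  # d(x^2)/dx = 2x, chained with gy

def backward(t):
    """Walk creator links from the output back to the leaf tensors."""
    t.grad = np.ones_like(t.data)
    f, gy = t.creator, t.grad
    while f is not None:
        x = f.input
        x.grad = f.backward(gy)
        f, gy = x.creator, x.grad

x = Tensor(3.0)
y = Square()(x)
backward(y)
assert float(x.grad) == 6.0  # dy/dx = 2 * 3
```

This single-input chain omits the multi-branch and generation-ordering logic the rest of the PR adds; it only shows the creator-link idea.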

ITHwang added 3 commits March 4, 2025 00:07
- The graph is serial, each node is in a row
…ate multiple edges

- Adopted generation concept in Tensor & Function
@ITHwang ITHwang requested a review from Copilot April 2, 2025 15:18

Copilot AI left a comment

Pull Request Overview

This PR introduces a draft implementation of a computational graph to support automatic differentiation using a new PriorityQueue for managing the backward propagation order. Key changes include:

  • The addition of a generic PriorityQueue in torch/priority_queue.py for handling function ordering during backpropagation.
  • Updates in torch/core.py that adopt this PriorityQueue and modify the backward propagation logic, including support for multi-branch operations.
  • Enhancements in tests, documentation, and CI configuration to support and validate the new design.
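
The generic PriorityQueue described above can be approximated with the standard-library heapq module. A hedged sketch (the actual torch/priority_queue.py may differ): functions are popped highest-priority first, matching the "generation" ordering backpropagation needs, and `__len__` gives an empty queue a False truth value.

```python
import heapq
from typing import Generic, List, Tuple, TypeVar

T = TypeVar("T")

class PriorityQueue(Generic[T]):
    """Max-first queue; ties broken by insertion order."""

    def __init__(self) -> None:
        self._heap: List[Tuple[int, int, T]] = []
        self._count = 0  # tie-breaker so unorderable items never compare

    def push(self, priority: int, item: T) -> None:
        # heapq is min-first, so negate the priority for max-first popping.
        heapq.heappush(self._heap, (-priority, self._count, item))
        self._count += 1

    def pop(self) -> T:
        return heapq.heappop(self._heap)[2]

    def __len__(self) -> int:
        # Also makes truth-testing work: bool(queue) is False when empty.
        return len(self._heap)

q: PriorityQueue[str] = PriorityQueue()
q.push(1, "leaf")
q.push(3, "output")
q.push(2, "middle")
assert [q.pop() for _ in range(len(q))] == ["output", "middle", "leaf"]
assert not q  # empty queue is falsy thanks to __len__
```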

Reviewed Changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.

Summary per file:

  • torch/priority_queue.py: Adds a generic PriorityQueue implementation for function ordering.
  • torch/core.py: Updates the backward propagation logic using PriorityQueue; refactors Function implementations.
  • tests/torch/test_core.py: Adds a multi-branch backward test to ensure correct gradient propagation.
  • README.md: Expands documentation on core concepts and the chain rule implementation.
  • .pre-commit-config.yaml: Updates configuration for pre-commit hooks.
  • .github/workflows/main.yml: Adjusts the CI workflow to use pre-commit actions and install pytest.

Comments suppressed due to low confidence (2)

torch/core.py:256

  • Duplicate backward method in the Exp class: the first backward definition, which merely returns y, should be removed. Python silently keeps only the later definition in a class body, so the first is dead code that invites confusion; the second definition properly handles input validation and gradient computation.
return y
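
A corrected Exp along the lines the comment suggests would keep a single backward that uses the saved input. This is an illustrative sketch, not the PR's final code (the real class plugs into the Function base class and Tensor wrappers):

```python
import numpy as np

class Exp:
    def forward(self, x: np.ndarray) -> np.ndarray:
        self.x = x           # save the input for the backward pass
        return np.exp(x)

    def backward(self, gy: np.ndarray) -> np.ndarray:
        # Single definition: d(exp(x))/dx = exp(x), chained with gy.
        return np.exp(self.x) * gy

f = Exp()
y = f.forward(np.array(1.0))
gx = f.backward(np.array(1.0))
assert np.allclose(gx, np.e)  # derivative of exp at x=1 is e
```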

torch/core.py:130

  • The PriorityQueue class does not implement __bool__, so 'while funcs:' will always evaluate to True. Either implement a __bool__ (or __len__) method on PriorityQueue, or change the loop to 'while len(funcs):' so it terminates when the queue is empty.
while funcs:
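
The root cause is Python's truth-testing rules: an instance of a class that defines neither __bool__ nor __len__ is always truthy, so 'while funcs:' never exits. Defining either method fixes it. A minimal repro (generic classes, not the PR's queue):

```python
class AlwaysTruthy:
    """No __bool__ or __len__: instances are truthy even when 'empty'."""
    def __init__(self):
        self.items = []

class Sized:
    def __init__(self):
        self.items = []

    def __len__(self):
        # Python falls back to __len__ for truth-testing when __bool__
        # is absent, so an empty container becomes falsy.
        return len(self.items)

assert bool(AlwaysTruthy()) is True  # 'while funcs:' would spin forever
assert bool(Sized()) is False        # loop terminates once empty
```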

@ITHwang ITHwang marked this pull request as ready for review April 12, 2025 15:44
@ITHwang ITHwang changed the title from "Feat: (draft) computational graph" to "Feat: computational graph" Apr 12, 2025
@ITHwang ITHwang requested a review from Copilot April 12, 2025 15:46
Copilot AI left a comment

Copilot reviewed 13 out of 13 changed files in this pull request and generated 2 comments.

@ITHwang ITHwang merged commit d225583 into main Apr 12, 2025
1 check passed
@ITHwang ITHwang deleted the feat/computational-graph branch April 12, 2025 15:53