[gemini] remove distributed-related part from colotensor #4378

@ver217

Description

Motivation

Since tensor parallelism is now implemented by shardformer, ColoTensor no longer needs to be regarded as a distributed tensor.

Overview

ColoParam should be retained, but all of its components related to distributed training can be removed.
