Support for Offline JavaScript Chaincode Installation#197

Closed
lindluni wants to merge 2 commits into hyperledger:master from lindluni:offlinechaincode

Conversation

@lindluni
Contributor

Background: Chaincode written in Node.js or TypeScript requires access to npm modules, either from the official NPM registry or from a local registry reachable from the chaincode build container. For networks running behind firewalls with no internet access, this means the user must set up and maintain a local NPM registry to serve the necessary packages, out-of-band complexity that is difficult to configure and maintain.

Proposal: Instead of requiring a local registry, we modify the `fabric-nodeenv` image to support npm tarballs created via the `npm pack` command. The user packages their chaincode in a tarball named `chaincode.pkg` after adding the `bundledDependencies` field to their `package.json`. If the `chaincode.pkg` file is present, the `fabric-nodeenv` chaincode builder simply extracts the archive and moves the chaincode source and modules into the proper directory.
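The detect-and-extract step could be sketched roughly as below. This is a hypothetical simulation, not the real `fabric-nodeenv` build script: it uses temporary directories in place of the builder's actual input and output paths, and fakes the `npm pack` tarball layout (contents under a top-level `package/` directory) so it runs without npm or a Fabric network.

```shell
#!/bin/sh
# Hypothetical sketch of the proposed builder step; real fabric-nodeenv
# paths and script layout differ. Simulated with temp directories.
SOURCE_DIR=$(mktemp -d)   # stands in for the chaincode source input
OUTPUT_DIR=$(mktemp -d)   # stands in for the builder's staging output

# Fake an `npm pack`-style tarball: npm pack places all contents under
# a top-level "package/" directory inside the archive.
mkdir -p "$SOURCE_DIR/package"
echo '{"name":"demo-chaincode"}' > "$SOURCE_DIR/package/package.json"
tar -czf "$SOURCE_DIR/chaincode.pkg" -C "$SOURCE_DIR" package
rm -rf "$SOURCE_DIR/package"

if [ -f "$SOURCE_DIR/chaincode.pkg" ]; then
  # Offline path: skip `npm install`; extract and hoist package/* up.
  tar -xzf "$SOURCE_DIR/chaincode.pkg" -C "$OUTPUT_DIR"
  mv "$OUTPUT_DIR"/package/* "$OUTPUT_DIR"/
  rmdir "$OUTPUT_DIR/package"
else
  echo "no chaincode.pkg found, falling back to npm install" >&2
fi
```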

Today, a chaincode directory looks like this:

```
Bretts-MBP:package btl5037$ tree
.
├── index.js
├── lib
│   ├── META-INF
│   │   └── statedb
│   │       └── couchdb
│   │           └── indexes
│   │               └── indexOwner.json
│   └── asset_transfer_ledger_chaincode.js
└── package.json
```

The user modifies their `package.json` to include the `bundledDependencies` field (which npm defines as an array of package names drawn from `dependencies`):

```
{
        "name": "asset-transfer-ledger-queries",
        "version": "1.0.0",
        "description": "asset chaincode implemented in node.js",
        "main": "index.js",
        "engines": {
                "node": ">=12",
                "npm": ">=5.3.0"
        },
        "scripts": {
                "start": "fabric-chaincode-node start"
        },
        "engine-strict": true,
        "license": "Apache-2.0",
        "bundledDependencies": [
                "fabric-contract-api",
                "fabric-shim"
        ],
        "dependencies": {
                "fabric-contract-api": "^2.0.0",
                "fabric-shim": "^2.0.0"
        }
}
```

The user then runs `npm install && npm pack && mv asset-transfer-ledger-queries-1.0.0.tgz chaincode.pkg`.
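Before shipping the tarball, it is worth confirming the bundled modules actually made it into the archive. The check below simulates this with a hand-built stand-in tarball (names are illustrative) since npm may not be available here; on a real `npm pack` tarball the same `tar -tzf` listing should show `package/node_modules/` entries.

```shell
#!/bin/sh
# Build a stand-in for an npm-pack tarball with one bundled module.
WORK=$(mktemp -d)
mkdir -p "$WORK/package/node_modules/fabric-shim"
echo '{}' > "$WORK/package/node_modules/fabric-shim/package.json"
echo '{"name":"asset-transfer-ledger-queries"}' > "$WORK/package/package.json"
tar -czf "$WORK/chaincode.pkg" -C "$WORK" package

# List the archive; node_modules entries confirm the deps were bundled.
tar -tzf "$WORK/chaincode.pkg" | grep node_modules
```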

The `fabric-nodeenv` builder would then clean up the directory, extract `chaincode.pkg`, and execute `mv package/* .` to stage the extracted JavaScript source code and metadata. That's it: the chaincode installs totally offline.

Signed-off-by: Brett Logan &lt;brett.t.logan@ibm.com&gt;

@lindluni
Contributor Author

/azp run

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@lindluni
Contributor Author

lindluni commented Aug 24, 2020

@mbwhite This simple modification to the fabric-nodeenv builder makes it possible to install chaincode fully offline; I've verified it works using the new ledger-queries chaincode in fabric-samples. Obviously it would need tests added, but I'm wondering what your thoughts are on supporting offline chaincode via the `npm pack` strategy?

I have tested this method to work while fully detaching my machine from the internet.

@lindluni lindluni marked this pull request as ready for review August 25, 2020 13:53
@lindluni lindluni requested a review from a team as a code owner August 25, 2020 13:53
@mbwhite
Member

mbwhite commented Aug 25, 2020

good thinking @btl5037 .. I think it certainly is an option

  1. Doesn't work for any modules with native components (as in 1.4 Fabric)
  2. Could get sizable, and there may be limits on the overall package size.

@lindluni
Contributor Author

lindluni commented Aug 25, 2020

good thinking @btl5037 .. I think it certainly is an option

  1. Doesn't work for any modules with native components (as in 1.4 Fabric)
  2. Could get sizable, and there may be limits on the overall package size.

1: It could be made to work in 1.4 though, right? If I used the chaincode builder (whether ours, or one specific to the deployment environment, like rhel-ubi) to package it, e.g. `docker run -v $(pwd):/data hyperledger/fabric-ccenv:1.4 sh -c "cd /data && npm install && npm pack"`, the compiled code would be built for the correct architecture.

2: Right, that is the major limitation I saw with this method (hitting the gRPC limit), but for a lot of use cases this would be a simple way to get off the ground rather than having to go through the entire process of setting up a registry. When I packaged my chaincode, the tarball was only 4 MB because it is compressed (the actual chaincode size was 25 MB, so we got decent compression). The gRPC limit is ~100 MB (minus some metadata), so I could still fit a lot of stuff in there.

It's worth noting we already support essentially this for Go chaincode via vendoring. This would just be a method we have to document well (which we would have to do for the NPM registry approach anyway). But at least this way we don't have to debug why remote registries are failing; it would take the support burden off of us for at least some deployments.

@lindluni
Contributor Author

/azp run

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

Brett Logan and others added 2 commits August 25, 2020 13:35
@denyeart denyeart closed this Mar 22, 2021
@denyeart denyeart deleted the branch hyperledger:master March 22, 2021 12:59