TF Big adds big number support to TensorFlow, allowing computations to be performed on arbitrary-precision integers. Internally these are represented as variant tensors of GMP values and exposed in Python through the tf_big.Tensor wrapper for convenience. For importing and exporting, numbers are typically expressed as strings.
import tensorflow as tf
import tf_big
# load large values as strings
x = tf_big.constant([["100000000000000000000", "200000000000000000000"]])
# load ordinary TensorFlow tensors
y = tf_big.import_tensor(tf.constant([[3, 4]]))
# perform computation as usual
z = x * y
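# assuming standard element-wise semantics for `*`, z now holds
# [["300000000000000000000", "800000000000000000000"]]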
# export result back into a TensorFlow tensor
tf_res = tf_big.export_tensor(z)
print(tf_res)
Python 3 packages are available from PyPI:
pip install tf-big
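A quick sanity check that the installed package imports correctly:

python -c "import tf_big"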
See below for further instructions for setting up a development environment.
We recommend using Miniconda or Anaconda to set up and use a Python 3.5 or 3.6 environment for all instructions below:
conda create -n tfbig-dev python=3.6
source activate tfbig-dev
The only requirement for Ubuntu is to have docker installed. This is the recommended way to build custom operations for TensorFlow. We provide a custom development container for TF Big with all dependencies already installed.
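Docker itself can be installed from the standard Ubuntu packages, for example (one option among several; Docker's official installation instructions work equally well):

sudo apt-get install docker.io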
Setting up a development environment on macOS is a little more involved since we cannot use a docker container. We need four things:
- Python (>= 3.5)
- Bazel (>= 0.15.0)
- GMP (>= 6.1.2)
- TensorFlow (see setup.py for version requirements for your TF Big version)
Using Homebrew we first make sure that both Bazel and GMP are installed. We recommend using a Bazel version earlier than 1.0.0, e.g.:
brew tap bazelbuild/tap
brew extract bazel bazelbuild/tap --version 0.26.1
brew install gmp
brew install mmv
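Depending on your Homebrew version, the extracted Bazel formula may also need to be installed explicitly; the formula name below is an assumption based on Homebrew's versioned-formula convention:

brew install bazelbuild/tap/bazel@0.26.1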
The remaining PyPI packages can then be installed using:
pip install -r requirements-dev.txt
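With everything installed, the toolchain versions can be sanity-checked along these lines (a sketch; adjust to your setup):

python --version
bazel version
python -c "import tensorflow as tf; print(tf.__version__)"
brew info gmp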
Run the tests on Ubuntu by running the make test command inside a docker container. The container is not yet available on Docker Hub, so it must first be built:
docker build -t tf-encrypted/tf-big:build .
Then we can run make test:
sudo docker run -it \
-v `pwd`:/opt/my-project -w /opt/my-project \
tf-encrypted/tf-big:build /bin/bash -c "make test"
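For iterative development it can be convenient to keep a shell open in the container instead, using the same mount (illustrative):

sudo docker run -it \
-v `pwd`:/opt/my-project -w /opt/my-project \
tf-encrypted/tf-big:build /bin/bash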
Once the development environment is set up, we can simply run:
make test
This will install TensorFlow if it is not already installed, then build and run the tests.
To build a pip package, just run:
make build && make bundle
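The resulting wheel can then be installed locally for a quick smoke test; the wheel path below is illustrative and depends on where the Makefile places its artifacts:

pip install tf_big-*.whl
python -c "import tf_big"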
On Linux, building inside the tensorflow/tensorflow:custom-op container is recommended. Note that Circle CI is currently used to build the official pip packages.
We use Circle CI for integration testing and deployment of TF Big.
1. update the version number in setup.py and push to master; this will build and test wheels
2. iterate step 1 until happy with the release, potentially testing the wheels manually
3. when happy, tag a commit with a semver label and push; this will build, test, and deploy wheels
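As an example, tagging and pushing a release could look like this (version number purely illustrative):

git tag 0.1.1
git push origin 0.1.1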