Ray Cluster Error at tokenizing documents #667
A few other notes:
I'm really confused. Tests pass locally, and I was able to run a job to completion. Can you download /tmp/ray/session_latest/logs/?
I'm using a shuffle buffer of 100000. This is the error I'm getting: These links might be helpful: Maybe we can just spill to GCP or some other store instead of using TPUs?
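For context on why a shuffle buffer of 100000 is memory-heavy: a streaming shuffle buffer holds that many documents in RAM at once. A minimal, generic sketch of the technique (an illustration, not this repo's implementation):

```python
import random
from typing import Iterable, Iterator, TypeVar

T = TypeVar("T")

def shuffle_stream(items: Iterable[T], buffer_size: int, seed: int = 0) -> Iterator[T]:
    """Approximately shuffle a stream while holding at most `buffer_size` items in memory."""
    rng = random.Random(seed)
    buffer: list[T] = []
    for item in items:
        if len(buffer) < buffer_size:
            # Fill the buffer before emitting anything.
            buffer.append(item)
        else:
            # Emit a random buffered item and replace it with the new one.
            idx = rng.randrange(buffer_size)
            yield buffer[idx]
            buffer[idx] = item
    # Drain whatever remains, in random order.
    rng.shuffle(buffer)
    yield from buffer
```

With `buffer_size=100000` every buffered element is a full tokenized document, which is where the memory pressure comes from.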
Sure, spilling to GCS sounds good. Want to try that?
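A sketch of what spilling to GCS could look like, using Ray's `smart_open`-based external object spilling (the bucket path is a placeholder, and this assumes `smart_open` is installed with GCS support):

```python
import json

# Hypothetical GCS destination -- replace with a real bucket/prefix.
spill_uri = "gs://my-bucket/ray-spill"

# Ray's object spilling is configured as a JSON string; the "smart_open"
# type delegates reads/writes to the smart_open library.
spilling_config = json.dumps(
    {"type": "smart_open", "params": {"uri": spill_uri}}
)

# Pass it to Ray at startup (commented out so the snippet runs without a cluster):
# import ray
# ray.init(_system_config={"object_spilling_config": spilling_config})
```

Spilled objects would then go to the bucket instead of local TPU-VM disk when the object store fills up.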
Kiloshard would probably have fixed this, actually.
(Really, we just need better back pressure.)
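On the back-pressure point: the usual fix is a bounded channel between producer and consumer, so the producer blocks instead of buffering unboundedly. A generic illustration (not this project's code), using a bounded `queue.Queue`:

```python
import queue
import threading

N = 20
results: list[int] = []

# A small maxsize forces back pressure: put() blocks once 4 items are pending.
q: "queue.Queue[int]" = queue.Queue(maxsize=4)

def produce() -> None:
    for i in range(N):
        q.put(i)  # blocks while the consumer is behind

def consume() -> None:
    for _ in range(N):
        results.append(q.get())

t = threading.Thread(target=produce)
t.start()
consume()
t.join()
```

Without the bound, a fast tokenizer feeding a slow writer accumulates objects until the store spills or the node runs out of memory, which matches the failure mode above.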
tests/test_tokenized_document_cache.py fails at test_doc_cache_reproduces_data_one_batch_per_shard; below is the log from the unit test: