
Commit eec6905

Merge pull request #1277 from Sage-Bionetworks/fix-link

Fix link in the documentation

2 parents: 96cb6c2 + 7248bc6

1 file changed: 1 addition & 1 deletion

docs/tutorials/python/file.md
````diff
@@ -46,7 +46,7 @@ In this tutorial you will:
 !!! warning "Uploading Large Files"
     If you are uploading very large files (>100 GB each), consider using **sequential uploads with async API** instead.
 
-    For large file uploads, see the `execute_walk_file_sequential()` function in [uploadBenchmark.py](https://github.com/Sage-Bionetworks/synapsePythonClient/blob/develop/docs/scripts/#L286) as a reference implementation. This approach uses `asyncio.run(file.store_async())` with the newer async API, which has been optimized for handling very large files efficiently. In benchmarks, this pattern successfully uploaded 45 files of 100 GB each (4.5 TB total) in approximately 20.6 hours.
+    For large file uploads, see the `execute_walk_file_sequential()` function in [uploadBenchmark.py](https://github.com/Sage-Bionetworks/synapsePythonClient/blob/develop/docs/scripts/uploadBenchmark.py#L286) as a reference implementation. This approach uses `asyncio.run(file.store_async())` with the newer async API, which has been optimized for handling very large files efficiently. In benchmarks, this pattern successfully uploaded 45 files of 100 GB each (4.5 TB total) in approximately 20.6 hours.
 
 #### First let's retrieve all of the Synapse IDs we are going to use
 ```python
````
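The documentation being fixed describes a sequential-upload pattern: drive one `asyncio.run(...)` call per file so each large upload finishes before the next begins. A minimal sketch of that control flow is below; `store_file_async` is a hypothetical stand-in for the Synapse client's real `file.store_async()` coroutine, which would perform the actual network upload.

```python
import asyncio
from pathlib import Path


async def store_file_async(path: str) -> str:
    # Hypothetical stand-in for the Synapse client's `file.store_async()`.
    # A real implementation would upload `path` and return its Synapse ID.
    await asyncio.sleep(0)  # placeholder for network I/O
    return f"syn-{Path(path).name}"


def upload_sequentially(paths: list[str]) -> list[str]:
    # One asyncio.run() per file: each upload runs to completion before the
    # next starts, which keeps memory and connection usage bounded when the
    # individual files are very large (>100 GB each).
    return [asyncio.run(store_file_async(p)) for p in paths]


if __name__ == "__main__":
    print(upload_sequentially(["a.bin", "b.bin"]))
```

The point of the sequential form, per the benchmark text in the diff, is throughput stability rather than speed: parallelizing many 100 GB uploads can exhaust memory or connections, whereas one-at-a-time async uploads completed 4.5 TB reliably.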