Commit 1c6c7b6

Use Markdown readme instead of RST to use native mermaid diagrams

1 parent 3222248 commit 1c6c7b6
File tree

6 files changed

+227
-229
lines changed


Makefile

Lines changed: 0 additions & 2 deletions
This file was deleted.

README.md

Lines changed: 226 additions & 0 deletions

# django-s3file

A lightweight file upload input for Django and Amazon S3.

Django-S3File allows you to upload files directly to AWS S3, effectively
bypassing your application server. This helps you avoid long-running
requests from large file uploads. It is particularly helpful if you run
your service on AWS Lambda or Heroku, where there is a hard request
limit.

[![PyPi Version](https://img.shields.io/pypi/v/django-s3file.svg)](https://pypi.python.org/pypi/django-s3file/)
[![Test Coverage](https://codecov.io/gh/codingjoe/django-s3file/branch/master/graph/badge.svg)](https://codecov.io/gh/codingjoe/django-s3file)
[![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://raw.githubusercontent.com/codingjoe/django-s3file/master/LICENSE)

## Features

- lightweight: less than 200 lines
- no JavaScript or Python dependencies (no jQuery)
- easy integration
- works just like Django's built-in widget
- extendable JavaScript API

## For the Nerds

```mermaid
sequenceDiagram
    autonumber
    actor Browser
    Browser->>S3: POST large file
    activate S3
    S3->>Browser: RESPONSE AWS S3 key
    Browser->>Middleware: POST AWS S3 key
    activate Middleware
    Middleware->>S3: GET AWS S3 key
    S3->>Middleware: RESPONSE large file promise
    deactivate S3
    Middleware->>Django: request incl. large file promise
    deactivate Middleware
    activate Django
    opt only if file is processed by Django
        Django-->>S3: GET large file
        activate S3
        S3-->>Django: RESPONSE large file
        deactivate S3
    end
    Django->>Browser: RESPONSE success
    deactivate Django
```

## Installation

Make sure you have [Amazon S3
storage](http://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html)
set up correctly.

Just install S3File using `pip`:

```bash
pip install django-s3file
# or
pipenv install django-s3file
```

Add the S3File app and middleware in your settings:

```python
# settings.py

INSTALLED_APPS = (
    '...',
    's3file',
    '...',
)

MIDDLEWARE = (
    '...',
    's3file.middleware.S3FileMiddleware',
    '...',
)
```

## Usage

S3File automatically replaces Django's `ClearableFileInput` widget; you
do not need to alter your code at all.

The `ClearableFileInput` widget is only automatically replaced when the
`DEFAULT_FILE_STORAGE` setting is set to `django-storages`'
`S3Boto3Storage` or the dummy `FileSystemStorage` is enabled.

### Setting up the AWS S3 bucket

#### Upload folder

S3File uploads to a single folder. Files are later moved by Django when
they are saved to the `upload_to` location.

It is recommended to [set up
expiration](http://docs.aws.amazon.com/AmazonS3/latest/dev/intro-lifecycle-rules.html)
for that folder, to ensure that old and unused file uploads don't pile
up and incur costs.

The default folder name is `tmp/s3file`. You can change it via the
`S3FILE_UPLOAD_PATH` setting.
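
Such an expiration rule can be applied programmatically, for example with
boto3. This is only a sketch: the bucket name `my-bucket`, the rule ID, and
the 7-day window are assumptions, not project defaults.

```python
# Sketch: expire stale S3File uploads via an S3 lifecycle rule.
# "my-bucket", the rule ID, and the 7-day window are placeholders.


def build_expiration_rule(prefix="tmp/s3file/", days=7):
    """Build a lifecycle rule that expires objects under the upload prefix."""
    return {
        "ID": "expire-s3file-uploads",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Expiration": {"Days": days},
    }


def apply_expiration(bucket="my-bucket"):
    import boto3  # imported lazily so the sketch loads without boto3 installed

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={"Rules": [build_expiration_rule()]},
    )
```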

#### CORS policy

You will need to allow `POST` from all origins. Just add the following
to your CORS policy:

```json
[
    {
        "AllowedHeaders": [
            "*"
        ],
        "AllowedMethods": [
            "POST"
        ],
        "AllowedOrigins": [
            "*"
        ],
        "ExposeHeaders": [],
        "MaxAgeSeconds": 3000
    }
]
```
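
The same policy can be applied from Python with boto3's `put_bucket_cors`;
a sketch, where the bucket name `my-bucket` is a placeholder:

```python
# Sketch: apply the CORS policy above with boto3 ("my-bucket" is a placeholder).
CORS_RULES = [
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["POST"],
        "AllowedOrigins": ["*"],
        "ExposeHeaders": [],
        "MaxAgeSeconds": 3000,
    }
]


def apply_cors(bucket="my-bucket"):
    import boto3  # imported lazily so the sketch loads without boto3 installed

    s3 = boto3.client("s3")
    s3.put_bucket_cors(
        Bucket=bucket,
        CORSConfiguration={"CORSRules": CORS_RULES},
    )
```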

### Progress Bar

S3File emits progress signals that can be used to display some kind of
progress bar. Signals named `progress` are emitted both for each
individual file input and for the form as a whole.

The progress signal carries the following details:

```javascript
console.log(event.detail)

{
  progress: 0.4725307607171312 // total upload progress of either a form or a single input
  loaded: 1048576 // bytes uploaded so far
  total: 2219064 // total bytes to upload
  currentFile: File {…} // file object
  currentFileName: "text.txt" // file name of the file currently uploaded
  currentFileProgress: 0.47227834703299176 // upload progress of that file
  originalEvent: ProgressEvent {…} // the original XHR onprogress event
}
```

The following example implements a Bootstrap progress bar for the upload
progress of an entire form.

```html
<div class="progress">
  <div class="progress-bar" role="progressbar" style="width: 0%;" aria-valuenow="0" aria-valuemin="0" aria-valuemax="100">0%</div>
</div>
```

```javascript
(function () {
  var form = document.getElementsByTagName('form')[0]
  var progressBar = document.getElementsByClassName('progress-bar')[0]

  form.addEventListener('progress', function (event) {
    // event.detail.progress is a value between 0 and 1
    var percent = Math.round(event.detail.progress * 100)

    progressBar.setAttribute('style', 'width:' + percent + '%')
    progressBar.setAttribute('aria-valuenow', percent)
    progressBar.innerText = percent + '%'
  })
})()
```

### Using S3File in development

Using S3File in development can be helpful, especially if you want to
use the progress signals described above. Therefore, S3File comes with
an AWS S3 dummy backend. It behaves similarly to the real S3 storage
backend. It is automatically enabled if the `DEFAULT_FILE_STORAGE`
setting is set to `FileSystemStorage`.

To prevent users from accidentally using the `FileSystemStorage` and the
insecure S3 dummy backend in production, there is also an additional
deployment check that will error when you run Django's deployment check
suite:

```shell
python manage.py check --deploy
```

We recommend always running the deployment check suite as part of your
deployment pipeline.

### Uploading multiple files

Django has limited support for [uploading multiple
files](https://docs.djangoproject.com/en/stable/topics/http/file-uploads/#uploading-multiple-files).
S3File fully supports this feature. The custom middleware ensures that
files are accessible via `request.FILES`, even though they have been
uploaded directly to AWS S3 and not to your Django application server.
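
As a sketch of handling such a multi-file input: the field name
`documents` and the helper below are hypothetical, not part of S3File's
API. Since the middleware repopulates `request.FILES`, the view code is
ordinary Django.

```python
# Sketch: handling a multi-file input named "documents" (hypothetical name).
# Nothing here is S3File-specific -- the middleware makes the uploads look
# like regular request.FILES entries.


def stored_names(files, storage):
    """Save every uploaded file with the given storage; return stored names."""
    return [storage.save(f.name, f) for f in files]


# In a Django view (sketch):
# def upload(request):
#     if request.method == "POST":
#         names = stored_names(request.FILES.getlist("documents"), default_storage)
```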

### Using optimized S3Boto3Storage

Since `S3Boto3Storage` supports storing data from any other file-like
object, it uses a generalized `_save` function. This leads to the
frontend uploading the file to S3 and then copying it byte-by-byte to
perform a move operation, just to rename the uploaded object. For large
files this leads to additional loading times for the user.

That's why S3File provides an optimized version of this method at
`storages_optimized.S3OptimizedUploadStorage`. It uses the more
efficient `copy` method from S3, given that we know that we only copy
from one S3 location to another.

```python
from s3file.storages_optimized import S3OptimizedUploadStorage


class MyStorage(S3OptimizedUploadStorage):  # Subclass and use like any other storage
    default_acl = 'private'
```
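
To activate the subclass, point Django's file storage setting at it; the
module path `myapp.storages` below is a placeholder for wherever you
define `MyStorage`:

```python
# settings.py -- "myapp.storages" is a hypothetical module path
DEFAULT_FILE_STORAGE = 'myapp.storages.MyStorage'
```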
