
Dify Backend API

Usage

[!IMPORTANT] In the v0.6.12 release, we deprecated pip as the package management tool for the Dify API backend service and replaced it with Poetry.
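If Poetry is not installed yet, it can be obtained via pipx or Poetry's official installer; this is general Poetry guidance rather than something this README prescribes:

    pipx install poetry
    # or, using the official installer:
    curl -sSL https://install.python-poetry.org | python3 -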

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can all be started together using Docker Compose.

   cd ../docker
   cp middleware.env.example middleware.env
   docker compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
    
  4. Create environment.

The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.
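For example, from the api directory:

    poetry shell    # spawns a sub-shell with the project's virtualenv activated; type exit to leave it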

  5. Install dependencies

    poetry env use 3.10
    poetry install
    

In case contributors have forgotten to update the dependencies in pyproject.toml, you can run the following shell commands instead.

   poetry shell                                          # activate the current environment
   poetry add $(cat requirements.txt)                    # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev    # install development dependencies and update pyproject.toml
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
  7. Start backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start Dify web service.
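A minimal sketch of starting the web frontend, assuming it lives in ../web and exposes the standard npm scripts described in its own README:

    cd ../web
    npm install     # or your preferred Node package manager
    npm run dev     # serves the frontend on http://localhost:3000 by default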

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, please start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
    

The started Celery app handles async tasks such as dataset importing and document indexing.
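To verify that the worker is up and responding, you can ping it through Celery's inspection interface; this is an optional sanity check, not part of the original steps:

    poetry run python -m celery -A app.celery inspect ping    # each running worker should answer with pong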

Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally with mocked system environment variables defined in the tool.pytest_env section of pyproject.toml (a way to inspect that section is sketched after the command below)

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
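To see which variables are mocked, you can inspect that section directly; a simple lookup, assuming it is run from the repository root (the 20-line context length is arbitrary):

    grep -A 20 "\[tool.pytest_env\]" api/pyproject.toml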