Dify Backend API

Usage

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
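
Optionally, you can verify that the middleware containers came up before moving on. This is a small sketch, assuming the compose project is named dify as in the commands above:

    # Containers started by the "dify" compose project include the project name in their names.
    docker ps --filter "name=dify"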
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
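
If sed is inconvenient, a cross-platform alternative is a one-liner using only the Python standard library (a sketch; paste the printed line into .env yourself):

    # Prints a line you can copy into .env; token_urlsafe(42) yields a ~56-character URL-safe key.
    python3 -c "import secrets; print('SECRET_KEY=' + secrets.token_urlsafe(42))"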
    
  4. Create environment.

The Dify API service uses Poetry to manage dependencies. You can run poetry shell to activate the environment.

Instructions for using pip instead can be found in the Usage with pip section below.

  5. Install dependencies

    poetry env use 3.10
    poetry install
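
Optionally, you can confirm which interpreter Poetry picked and what it installed, using Poetry's built-in commands (a quick check, nothing more):

    poetry env info    # show the virtualenv path and Python version Poetry is using
    poetry show        # list the installed packages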
    

If a contributor forgot to update pyproject.toml when adding dependencies, you can run the following shell commands instead.

   poetry shell                                         # activate the current environment
   poetry add $(cat requirements.txt)                   # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev   # install development dependencies and update pyproject.toml
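
Before committing, it can be worth reviewing what those commands changed (optional):

   git diff pyproject.toml poetry.lock   # review the dependency changes made by the poetry add commands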
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
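
Optionally, you can check that the database is now at the latest revision with Flask-Migrate's current command (same environment as above):

    poetry run python -m flask db current   # prints the Alembic revision the database is currently at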
  7. Start backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start Dify web service.

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, please start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
    

The Celery worker handles async tasks such as dataset importing and document indexing.
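
To confirm the worker has connected to the broker, Celery's built-in status command can be run from a second shell (an optional check):

    poetry run python -m celery -A app.celery status   # lists the worker nodes that responded to the ping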

Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally, with the mocked system environment variables defined in the tool.pytest_env section of pyproject.toml

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
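
If you only want a subset of the suite rather than the full script, pytest can also be invoked directly. This is a sketch; run it from the api directory and replace the -k expression with whatever you want to select:

    cd api
    poetry run pytest tests -k "some_keyword"   # the tool.pytest_env values are applied automatically when pytest runs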
    

Usage with pip

[!NOTE]
In the next version, pip will be deprecated as the primary package-management tool for the Dify API service; currently Poetry and pip coexist.

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
  4. Create environment.

If you use Anaconda, create a new environment and activate it:

   conda create --name dify python=3.10
   conda activate dify
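
If you don't use Anaconda, a plain virtual environment from the standard library works as well (a minimal sketch; the .venv directory name is just a convention, and python3.10 must be on your PATH):

   python3.10 -m venv .venv    # create the environment with the Python version used by the project
   source .venv/bin/activate   # activate it for the current shell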
  5. Install dependencies

    pip install -r requirements.txt
    
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   flask db upgrade
  7. Start backend:

    flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Set up your application by visiting http://localhost:5001/console/api/setup or other APIs...
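
A quick way to check that the endpoint above is reachable is a plain curl request (optional; the exact JSON returned depends on your version and setup state):

    curl http://localhost:5001/console/api/setup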

  9. If you need to debug local async processing, please start the worker service.

    celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
    

The Celery worker handles async tasks such as dataset importing and document indexing.