Dify Backend API

Usage

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
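To confirm that the middleware containers are running, you can optionally list them with Docker (this works from any directory):

   docker ps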
  2. Copy .env.example to .env
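For example, from the api directory:

    cp .env.example .env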
  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
    secret_key=$(openssl rand -base64 42)
    sed -i '' "/^SECRET_KEY=/c\\
    SECRET_KEY=${secret_key}" .env
    
  4. Create environment.

The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.

Instructions for using pip can be found below.

  5. Install dependencies

    poetry env use 3.10
    poetry install
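
To verify that the environment is using the expected Python interpreter, you can optionally run:

    poetry run python --version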
    

If a contributor forgot to update the dependencies in pyproject.toml, you can run the following shell commands instead.

   poetry shell                                        # activate the current environment
   poetry add $(cat requirements.txt)                  # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev  # install development dependencies and update pyproject.toml
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
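
To confirm which revision the database is now at, you can optionally use Flask-Migrate's current command:

   poetry run python -m flask db current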
  7. Start backend

    poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Start Dify web service.

  9. Set up your application by visiting http://localhost:3000...

  10. If you need to debug local async processing, please start the worker service.

    poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
    

The started Celery app handles the async tasks, e.g. dataset importing and document indexing.
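To check that the worker is up and has registered its tasks, you can optionally use Celery's standard inspection command:

    poetry run python -m celery -A app.celery inspect registered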

Testing

  1. Install dependencies for both the backend and the test environment

    poetry install --with dev
    
  2. Run the tests locally, with mocked system environment variables defined in the tool.pytest_env section of pyproject.toml

    cd ../
    poetry run -C api bash dev/pytest/pytest_all_tests.sh
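
To run only part of the suite during development, you can invoke pytest directly and point it at a specific path (the tests directory below is just an example target):

    cd api
    poetry run python -m pytest tests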
    

Usage with pip

[!NOTE]
In the next version, we will deprecate pip as the primary package management tool for the Dify API service. Currently, Poetry and pip coexist.

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
  2. Copy .env.example to .env
  3. Generate a SECRET_KEY in the .env file.

    sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
    
  4. Create environment.

If you use Anaconda, create a new environment and activate it:

   conda create --name dify python=3.10
   conda activate dify
  5. Install dependencies

    pip install -r requirements.txt
    
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   flask db upgrade
  7. Start backend:

    flask run --host 0.0.0.0 --port=5001 --debug
    
  8. Set up your application by visiting http://localhost:5001/console/api/setup or other APIs...
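
If you prefer the command line, you can check that the setup endpoint above responds before opening it in a browser:

    curl http://localhost:5001/console/api/setup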

  9. If you need to debug local async processing, please start the worker service.

    celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail
    

The started Celery app handles the async tasks, e.g. dataset importing and document indexing.