Complete phase 2 of the file module migration by replacing the remaining repository-wide legacy imports and deleting the temporary `core.file` compatibility package introduced in phase 1.

**What this commit changes**

- Replace legacy `core.file.*` imports with `core.workflow.file.*` across:
  - controllers
  - core app/agent/datasource/prompt/rag/tools/variables
  - factories, fields, libs, models, services
  - otel parser integration points
  - unit and integration tests that referenced legacy paths
- Migrate residual runtime usages in app/task pipeline paths that still referenced `core.file` symbols.
- Update tests and model serialization helpers that relied on old module paths.
- Remove the compatibility bridge package entirely:
  - delete `core/file/__init__.py`
  - delete `core/file/constants.py`
  - delete `core/file/enums.py`
  - delete `core/file/file_manager.py`
  - delete `core/file/helpers.py`
  - delete `core/file/models.py`
  - delete `core/file/tool_file_parser.py`

**Verification**

- No Python references to `core.file` remain (search returns empty).
- Targeted regression tests for migrated file primitives and factory/type flows passed:
  - `tests/unit_tests/core/test_file.py`
  - `tests/unit_tests/factories/test_variable_factory.py`
  - `tests/unit_tests/services/test_variable_truncator.py`

**Result**

- The repository now uses `core.workflow.file` as the single canonical file namespace.
- The migration is fully split into two commits: phase 1 compatibility + phase 2 full cutover.
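For illustration only, here is what the import cutover looks like at a call site. The module paths `core.file` and `core.workflow.file` come from the change description above; the imported symbol names are assumptions, not taken from the commit.

```python
# Illustrative sketch of the phase 2 cutover; symbol names are assumed, not from the commit.

# Legacy path, previously served by the phase 1 compatibility bridge (deleted in this commit):
from core.file import File, FileType

# Canonical path after the phase 2 cutover:
from core.workflow.file import File, FileType
```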
# Dify Backend API
## Setup and Run
> [!IMPORTANT]
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for the Dify API backend service.
> `uv` and `pnpm` are required to run the setup and development commands below.
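A quick way to confirm both tools are on your PATH before running anything below (any reasonably recent versions should do; the exact output format varies by release):

```bash
uv --version
pnpm --version
```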
### Using scripts (recommended)
The scripts resolve paths relative to their location, so you can run them from anywhere.
- Run setup (copies env files and installs dependencies).

  ```bash
  ./dev/setup
  ```

- Review `api/.env`, `web/.env.local`, and `docker/middleware.env` values (see the `SECRET_KEY` note below).

- Start middleware (PostgreSQL/Redis/Weaviate).

  ```bash
  ./dev/start-docker-compose
  ```

- Start backend (runs migrations first).

  ```bash
  ./dev/start-api
  ```

- Start Dify web service.

  ```bash
  ./dev/start-web
  ```

- Set up your application by visiting [http://localhost:3000](http://localhost:3000).

- Optional: start the worker service (async tasks, runs from `api`).

  ```bash
  ./dev/start-worker
  ```

- Optional: start Celery Beat (scheduled tasks).

  ```bash
  ./dev/start-beat
  ```
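Because the scripts resolve paths relative to their own location, they can also be invoked from outside the repository root, for example (the absolute path below is a placeholder for wherever you cloned Dify):

```bash
# No need to cd into the repo first; adjust the path to your checkout.
/path/to/dify/dev/start-api
```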
### Manual commands
<details>
<summary>Show manual setup and run steps</summary>

These commands assume you start from the repository root.
- Start the docker-compose stack.

  The backend requires middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

  ```bash
  cp docker/middleware.env.example docker/middleware.env
  # Use mysql or another vector database profile if you are not using postgres/weaviate.
  docker compose -f docker/docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
  ```

- Copy env files.

  ```bash
  cp api/.env.example api/.env
  cp web/.env.example web/.env.local
  ```

- Install `uv` if needed.

  ```bash
  pip install uv
  # Or on macOS
  brew install uv
  ```

- Install API dependencies.

  ```bash
  cd api
  uv sync --group dev
  ```

- Install web dependencies.

  ```bash
  cd web
  pnpm install
  cd ..
  ```

- Start backend (runs migrations first, in a new terminal).

  ```bash
  cd api
  uv run flask db upgrade
  uv run flask run --host 0.0.0.0 --port=5001 --debug
  ```

- Start Dify web service (in a new terminal).

  ```bash
  cd web
  pnpm dev:inspect
  ```

- Set up your application by visiting [http://localhost:3000](http://localhost:3000).

- Optional: start the worker service (async tasks, in a new terminal).

  ```bash
  cd api
  uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q api_token,dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
  ```

- Optional: start Celery Beat (scheduled tasks, in a new terminal).

  ```bash
  cd api
  uv run celery -A app.celery beat
  ```

</details>
## Environment notes
> [!IMPORTANT]
> When the frontend and backend run on different subdomains, set `COOKIE_DOMAIN` to the site's top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
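As a minimal sketch, assuming the frontend is served from `app.example.com` and the backend from `api.example.com` (hypothetical hostnames), the setting would look like this in the relevant env file(s):

```bash
# Shared top-level domain so authentication cookies are visible to both subdomains
COOKIE_DOMAIN=example.com
```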
- Generate a `SECRET_KEY` in the `.env` file.

  For Linux:

  ```bash
  sed -i "/^SECRET_KEY=/c\\SECRET_KEY=$(openssl rand -base64 42)" .env
  ```

  For Mac:

  ```bash
  secret_key=$(openssl rand -base64 42)
  sed -i '' "/^SECRET_KEY=/c\\
  SECRET_KEY=${secret_key}" .env
  ```
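To double-check that the key was written, a plain `grep` works (this assumes you ran the command above from the `api` directory, where `.env` lives):

```bash
grep '^SECRET_KEY=' .env
```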
## Testing
- Install dependencies for both the backend and the test environment.

  ```bash
  cd api
  uv sync --group dev
  ```
- Run the tests locally. Mocked system environment variables are defined in the `tool.pytest_env` section of `pyproject.toml`; see `Claude.md` for more details.

  ```bash
  cd api
  uv run pytest                           # Run all tests
  uv run pytest tests/unit_tests/         # Unit tests only
  uv run pytest tests/integration_tests/  # Integration tests

  # Code quality
  ./dev/reformat               # Run all formatters and linters
  uv run ruff check --fix ./   # Fix linting issues
  uv run ruff format ./        # Format code
  uv run basedpyright .        # Type checking
  ```
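For example, to re-run a single test file with verbose output, such as one of the regression tests listed in the migration notes at the top of this document (adjust the path if your checkout differs):

```bash
cd api
uv run pytest tests/unit_tests/core/test_file.py -v
```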