Upgrading to v1.10
Resources
- dbt Core v1.10 changelog (coming soon)
- dbt Core CLI Installation guide
- Cloud upgrade guide
What to know before upgrading
dbt Labs is committed to providing backward compatibility for all versions 1.x. Any behavior changes will be accompanied by a behavior change flag to provide a migration window for existing projects. If you encounter an error upon upgrading, please let us know by opening an issue.
Starting in 2024, dbt Cloud provides the functionality from new versions of dbt Core via release tracks with automatic upgrades. If you have selected the "Latest" release track in dbt Cloud, you already have access to all the features, fixes, and other functionality that is included in dbt Core v1.10! If you have selected the "Compatible" release track, you will have access in the next monthly "Compatible" release after the dbt Core v1.10 final release.
For users of dbt Core, since v1.8 we have recommended explicitly installing both `dbt-core` and `dbt-<youradapter>`. This may become required in a future version of dbt. For example:

```shell
python3 -m pip install dbt-core dbt-snowflake
```
New and changed features and functionality
New features and functionality available in dbt Core v1.10
The `--sample` flag
Large data sets can slow down dbt build times, making it harder for developers to test new code efficiently. The `--sample` flag, available for the `run` and `build` commands, helps reduce build times and warehouse costs by running dbt in sample mode. It generates filtered refs and sources using time-based sampling, allowing developers to validate outputs without building entire models.
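As a sketch, an invocation of sample mode might look like the following; the relative time-window syntax shown is illustrative, so check the sample mode documentation for the exact format your version accepts:

```shell
# Build the project in sample mode, filtering time-based refs and
# sources down to a recent window (window syntax is illustrative)
dbt build --sample=3d
```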
Integrating dbt Core artifacts with dbt Cloud projects
With hybrid projects, dbt Core users working in the command line interface (CLI) can execute runs that seamlessly upload artifacts into dbt Cloud. This enhances hybrid dbt Core/dbt Cloud deployments by:
- Fostering collaboration between dbt Cloud + dbt Core users by enabling them to visualize and perform cross-project references to models defined in dbt Core projects. This feature unifies dbt Cloud + dbt Core workflows for a more connected dbt experience.
- Giving dbt Cloud and dbt Core users insights into their models and assets in Explorer. To view Explorer, you must have a developer or read-only license.
- (Coming soon) Enabling users working in the Canvas to build off of models already created by a central data team in dbt Core rather than having to start from scratch.
Hybrid projects are available as a private beta to dbt Cloud Enterprise accounts. Contact your account representative to register your interest in the beta.
Managing changes to legacy behaviors
dbt Core v1.10 introduces new flags for managing changes to legacy behaviors. You may opt into recently introduced changes (disabled by default), or opt out of mature changes (enabled by default), by setting `True`/`False` values, respectively, for flags in `dbt_project.yml`.
You can read more about each of these behavior changes in the following links:
- (Introduced, disabled by default) `validate_macro_args`. If the flag is set to `True`, dbt will raise a warning if the argument names you've added in your macro YAMLs don't match the argument names in your macro, or if the argument types aren't valid according to the supported types.
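As a sketch, opting into the new validation in `dbt_project.yml` and documenting a macro's arguments might look like the following (the macro name and its arguments are illustrative, not from the source):

```yaml
# dbt_project.yml
flags:
  validate_macro_args: true

# macros/_macros.yml (illustrative macro documentation)
macros:
  - name: cents_to_dollars
    arguments:
      - name: column_name
        type: string
      - name: scale
        type: integer
```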
Deprecation warnings
Starting in v1.10, you will receive deprecation warnings for dbt code that will become invalid in the future, including:
- Custom inputs (for example, unrecognized resource properties, configurations, and top-level keys)
- Duplicate YAML keys in the same file
- Unexpected Jinja blocks (for example, `{% endmacro %}` tags without a corresponding `{% macro %}` tag)
- And more
dbt will start raising these warnings in version 1.10, but making these changes will not be a prerequisite for using it. We at dbt Labs understand that it will take existing users time to migrate their projects, and it is not our goal to disrupt anyone with this update. The goal is to enable you to work with more safety, feedback, and confidence going forward.
What does this mean for you?
- If your project (or dbt package) encounters a new deprecation warning in v1.10, plan to update your invalid code soon. Although it’s just a warning for now, in a future version, dbt will enforce stricter validation of the inputs in your project. Check out the `dbt-cleanup` tool to autofix many of these!
- In the future, the `meta` config will be the only place to put custom user-defined attributes. Everything else will be strongly typed and strictly validated. If you have an extra attribute you want to include in your project, or a model config you want to access in a custom materialization, you must nest it under `meta` moving forward.
- If you are using the `--warn-error` flag (or `--warn-error-options '{"error": "all"}'`) to promote all warnings to errors, this will include new deprecation warnings coming to dbt Core. If you don’t want these to be promoted to errors, the `--warn-error-options` flag gives you more granular control over exactly which types of warnings are treated as errors. You can set `"warn": ["Deprecations"]` (new as of v1.10) to continue treating the deprecation warnings as warnings.
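Putting those two options together, a command that promotes all warnings to errors while keeping deprecation warnings as warnings might look like this (shown as a CLI sketch using the flag syntax described above):

```shell
# Treat all warnings as errors, except deprecation warnings
dbt --warn-error-options '{"error": "all", "warn": ["Deprecations"]}' run
```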
Custom inputs
Historically, dbt has allowed largely unconstrained inputs. A common example of this is setting custom YAML properties:
```yaml
models:
  - name: my_model
    description: A model in my project.
    dbt_is_awesome: true # a custom property
```
dbt detects the unrecognized custom property (`dbt_is_awesome`) and silently continues. Without a set of strictly defined inputs, it becomes challenging to validate your project's configuration. This creates unintended issues such as:
- Silently ignoring misspelled properties and configurations (for example, `desciption:` instead of `description:`).
- Unintended collisions with user code when dbt introduces a new “reserved” property or configuration.
If you have an unrecognized custom property, you will receive a warning, and in a future version, dbt will cease to support custom properties. Moving forward, these should be nested under the `meta` config, which will be the only place to put custom user-defined attributes:
```yaml
models:
  - name: my_model
    description: A model in my project.
    config:
      meta:
        dbt_is_awesome: true
```
Duplicate keys in the same YAML file
If two identical keys exist in the same YAML file, you will get a warning, and in a future version, dbt will stop supporting duplicate keys. Previously, if identical keys existed in the same YAML file, dbt silently overwrote the earlier one, using the last configuration listed in the file.
```yaml
my_profile:
  target: my_target
  outputs:
    ...
my_profile: # dbt would use only this profile key
  target: my_other_target
  outputs:
    ...
```
Moving forward, you should delete unused keys or move them to a separate YAML file.
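To see why the last entry silently won, here is a minimal Python sketch, an analogy rather than dbt's actual parser, of the "last key wins" behavior most YAML loaders apply to duplicate keys:

```python
# Simulate a YAML mapping parsed as an ordered sequence of key/value
# pairs, with "my_profile" appearing twice in the same file.
pairs = [
    ("my_profile", {"target": "my_target"}),
    ("my_profile", {"target": "my_other_target"}),
]

# Like most YAML loaders, dict() keeps only the last value
# for a repeated key -- the earlier entry is silently dropped.
parsed = dict(pairs)
print(parsed["my_profile"]["target"])  # -> my_other_target
```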
Unexpected jinja blocks
If you have an orphaned Jinja block, you will receive a warning, and in a future version, dbt will stop supporting unexpected Jinja blocks. Previously, these orphaned Jinja blocks were silently ignored.
```jinja
{% endmacro %} # orphaned endmacro jinja block

{% macro hello() %}
hello!
{% endmacro %}
```
Moving forward, you should delete these orphaned Jinja blocks.
Quick hits
- Provide the `loaded_at_query` property for source freshness to specify custom SQL to generate the `maxLoadedAt` timestamp on the source (versus the built-in query, which uses the `loaded_at_field`). You cannot define `loaded_at_query` if the `loaded_at_field` config is also provided.
- Provide validation for macro arguments using the `validate_macro_args` flag, which is disabled by default. When enabled, this flag checks that documented macro argument names match those in the macro definition and validates their types against a supported format. Previously, dbt did not enforce standard argument types, treating the type field as documentation-only. If no arguments are documented, dbt infers them from the macro and includes them in the `manifest.json` file. Learn more about supported types.
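As a sketch of the first quick hit, a source configured with `loaded_at_query` (in place of `loaded_at_field`) might look like the following; the source, table, and column names are illustrative, so check the source freshness documentation for the exact property placement:

```yaml
# models/sources.yml (illustrative names)
sources:
  - name: raw_shop
    freshness:
      warn_after: {count: 12, period: hour}
    # Custom SQL that produces the maxLoadedAt timestamp;
    # do not combine this with loaded_at_field.
    loaded_at_query: "select max(etl_updated_at) from {{ this }}"
    tables:
      - name: orders
```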