[Spark] Temporary tables created in default schema instead of configured schema #2167

@ahurducas

Description

Describe the bug
When running dbt run --select elementary with Spark (Hive), Elementary attempts to create temporary tables (e.g., dbt_models__tmp_...) in the default schema instead of using the schema specified in dbt_project.yml. This causes a permission error because the user doesn't have CREATE privileges on the default database.

To Reproduce

  1. Configure Elementary in dbt_project.yml with a custom schema:
   models:
     elementary:
       +schema: elementary
       +file_format: "iceberg"
  2. Run dbt run --select elementary
  3. Observe that the temporary tables are created in the default schema instead of elementary

Expected behavior
All Elementary tables, including temporary/intermediate tables (__tmp_...), should be created in the configured schema (elementary), not in the default schema.

Environment:

  • Elementary CLI (edr) version: N/A
  • Elementary dbt package version: 0.23.0
  • dbt version you're using: 1.9.2
  • Data warehouse: Spark (Hive Metastore)
  • Infrastructure details: Hive with permission restrictions on default database

Additional context
Error log:

Runtime Error in model dbt_models (models/edr/dbt_artifacts/dbt_models.sql)
  Runtime Error
    Error while executing query: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Permission denied: user [myuser] does not have [CREATE] privilege on [default/dbt_models__tmp_20260320073326791798])

The issue appears to be that the __tmp_ table creation logic does not inherit or respect the schema configuration, and instead falls back to the default database.
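A possible interim workaround, pending a proper fix: dbt-spark's implementation of the make_temp_relation macro appears to drop the schema from the temporary relation, which would explain why the __tmp_ tables land in default. Since dbt resolves adapter macros from the user's project first, a project-level override that keeps the base relation's schema may avoid the permission error. This is an untested sketch following dbt's adapter-dispatch naming convention, not a verified fix:

```sql
-- macros/spark__make_temp_relation.sql (hypothetical project-level override)
{% macro spark__make_temp_relation(base_relation, suffix) %}
    {%- set tmp_identifier = base_relation.identifier ~ suffix -%}
    {#- Keep the base relation's schema instead of discarding it, so the
        temporary table is created in the configured schema (e.g. elementary)
        rather than in the default database. -#}
    {%- set tmp_relation = base_relation.incorporate(
        path={"identifier": tmp_identifier}) -%}
    {{ return(tmp_relation) }}
{% endmacro %}
```

Note that this override affects temp-table creation for all Spark models in the project, not just Elementary's, so it should be validated against other incremental models before use.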

Would you be willing to contribute a fix for this issue?
Yes
