Hi,
Thank you for the framework. I'm on 0.16.6.
I noticed from the source code that the Kedro DAG is static, which raises a few questions.
- I have a data source:
  - it is an SQL database that returns the names of CSV files for a given DS_ID;
  - the next step is to fetch those CSV files with the actual data; their count varies.

I'd like to parametrize DS_ID in catalog.yml for each production run. The only possibility I found is as follows (rough sketch after the list):
- register a TemplatedConfigLoader via hooks;
- create a conf/concrete_ds_id/globals.yml on the fly and pass --env concrete_ds_id to the run.
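For reference, this is roughly what I mean (just a sketch; the ds_id key, the SQL query and the dataset name are illustrative, not my real config):

```python
# hooks.py -- sketch of the workaround described above
from typing import Iterable

from kedro.config import TemplatedConfigLoader
from kedro.framework.hooks import hook_impl


class ProjectHooks:
    @hook_impl
    def register_config_loader(self, conf_paths: Iterable[str]):
        # Values from conf/<env>/globals.yml get substituted into ${...}
        # placeholders in catalog.yml.
        return TemplatedConfigLoader(conf_paths, globals_pattern="*globals.yml")


# conf/concrete_ds_id/globals.yml (written on the fly before the run):
#   ds_id: 42
#
# catalog.yml (illustrative entry):
#   csv_names:
#     type: pandas.SQLQueryDataSet
#     sql: "SELECT file_name FROM csv_files WHERE ds_id = ${ds_id}"
#     credentials: db_credentials
#
# Then: kedro run --env concrete_ds_id
```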
Is there a better way?
I also found the ProjectContext._get_pipelines method. It seems like it could solve the problem. The question is: how good an idea is it to rely on overriding a protected method?
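In case it helps, this is roughly what I have in mind (a sketch only: query_csv_names and process_csv are placeholder helpers of mine, and I'm assuming ds_id ends up in parameters):

```python
# run.py -- sketch of overriding the protected method
from typing import Dict

from kedro.framework.context import KedroContext
from kedro.pipeline import Pipeline, node

from my_project.nodes import process_csv, query_csv_names  # placeholder helpers


class ProjectContext(KedroContext):
    project_name = "my-project"
    project_version = "0.16.6"
    # ... other attributes as in the generated run.py

    def _get_pipelines(self) -> Dict[str, Pipeline]:
        # Ask the SQL source which CSV files belong to the current DS_ID
        # and build one node per file (each name assumed to have a catalog entry).
        csv_names = query_csv_names(self.params["ds_id"])
        nodes = [
            node(
                process_csv,
                inputs=name,
                outputs=f"{name}_processed",
                name=f"process_{name}",
            )
            for name in csv_names
        ]
        return {"__default__": Pipeline(nodes)}
```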
- Next thing: I'd like to compose a pipeline based on the fetched CSV files, with each of them as an input to a node. But I have no access to catalog entries in ProjectHooks, so I cannot create nodes dynamically in register_pipelines. How can I do that? (A sketch of what I'm after is below.)
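To make that concrete, here is the kind of thing I'd like to be able to write (combine is a placeholder node function, and the csv_datasets list is exactly what I don't know how to obtain inside the hook):

```python
# hooks.py -- what I'd like register_pipelines to look like (not working code)
from typing import Dict

from kedro.framework.hooks import hook_impl
from kedro.pipeline import Pipeline, node


def combine(*dataframes):
    """Placeholder: merge all fetched CSVs into a single output."""
    ...


class ProjectHooks:
    @hook_impl
    def register_pipelines(self) -> Dict[str, Pipeline]:
        # I would like to iterate over the CSV datasets that belong to the
        # current DS_ID, but the catalog is not available at this point.
        csv_datasets = ["..."]  # <- how do I get these here?
        return {
            "__default__": Pipeline(
                [node(combine, inputs=csv_datasets, outputs="combined")]
            ),
        }
```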