6.5 Parameterize the Source and Target Tables or Objects in IICS DI

Comments

Hello Ranjan,
Thanks for the explanation. Since the demo parameterizes the connections and the source/target objects, why are the values for the source/target objects hardcoded during task configuration? The same parameter value is also set again in the parameter file. If I'm not wrong, it should ultimately be passed only from the parameter file. Which value does it take: the one set in the mapping task or the one from the param file?

mallikarjun
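[Editorial note] For reference, a mapping task parameter file is typically a small text file like the fragment below; the project/folder/task path and parameter names here are illustrative, not from the video. As I understand it, values saved in the task act as defaults, and a value found in the parameter file at runtime overrides them:

```
#USE_SECTIONS
[MyProject].[MyFolder].[mt_emp_load]
$$par_src_conn=ORA_SRC_CONN
$$par_src_tbl_name=EMP
```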

Hi, I'm trying to run a mapping that updates three different target tables. The issue is that the three tables have different columns; I only need to update one column on each, so it seems I can't use dynamic mapping tasks. What do you suggest?

mauriciomartinez

Thanks for the quick response. Yes, as in PowerCenter, in Cloud the taskflow can override the mapping task values, and the mapping task can override the mapping values. Can't we pass/set, for example, param_ora_src=$$par_src_conn at the task level, and define these values ($$par_src_conn, $$par_src_tbl_name) in the param file? I see that we are passing a hardcoded object name (i.e. 'EMP') and connection name at the task level itself, and then setting the same values again in the param file.

mallikarjun

Hi Ranjan, I am trying to create a similar process, but my requirement is to pass the parameter values while executing a taskflow. I am seeing that the job always takes the values assigned in the task (I have checked the runtime override option).

soumyajitbiswas

Thanks @itsranjan2003, but I want to check one thing.


Scenario:

I want to load 200 source tables from a MySQL DB into 200 target tables in a Snowflake DB using just one mapping, with no transformations, and run that mapping every day.

Just run that mapping every morning and apply CDC from the source data to the target data.


How is that possible, any idea?

kieees_
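[Editorial note] One workaround that is often used for this kind of many-tables scenario (a sketch, not an official IICS feature) is metadata-driven execution: keep the 200 source/target table pairs in a list or control table, generate one parameter file per pair, and run the same parameterized mapping task once per file. A minimal Python sketch of the parameter-file generation step, assuming the `#USE_SECTIONS` parameter-file layout and hypothetical project/task/parameter names:

```python
# Generate one IICS-style parameter file per source/target table pair.
# The task path and parameter names ($$par_src_tbl, $$par_tgt_tbl)
# are hypothetical placeholders, not from the video.

TABLE_PAIRS = [
    ("mysql_db.customers", "SNOWFLAKE_DB.CUSTOMERS"),
    ("mysql_db.orders", "SNOWFLAKE_DB.ORDERS"),
    # ... up to 200 pairs, or read them from a metadata table
]

def build_param_file(src_table: str, tgt_table: str) -> str:
    """Return the text of a parameter file for one table pair."""
    return "\n".join([
        "#USE_SECTIONS",
        "[MyProject].[MyFolder].[mt_load_table]",  # hypothetical task path
        f"$$par_src_tbl={src_table}",
        f"$$par_tgt_tbl={tgt_table}",
        "",
    ])

# Map an output file name to the file contents for each pair.
param_files = {
    f"paramfile_{i}.param": build_param_file(src, tgt)
    for i, (src, tgt) in enumerate(TABLE_PAIRS)
}
```

Each generated file would then be written to the Secure Agent's parameter-file directory before triggering the task for that pair; the CDC logic itself would still have to live in the mapping or in the source query.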

Ranjan, can I dynamically create a source for every table name entry made in a metadata table?

thenotorious

Hi @itsranjan2003, while parameterizing the source and target table names, can we keep multiple source and target table names in the parameter file?

kieees_

What if I have a source query of the type "select * from schema.table" and I want the schema to be parameterized? I have schema A for the dev environment and schema B for QAT. I am trying to define an input parameter of type string, but it's not working.

ShaileshNavghare-nx
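[Editorial note] One pattern commonly used for this (a sketch, assuming a string parameter named $$SRC_SCHEMA; the name is illustrative) is to reference the parameter directly inside the custom query and supply the schema per environment from the parameter file:

```
-- Custom query in the Source transformation;
-- $$SRC_SCHEMA is resolved at runtime from the parameter file
SELECT * FROM $$SRC_SCHEMA.table

-- Parameter file, dev environment:
$$SRC_SCHEMA=SCHEMA_A

-- Parameter file, QAT environment:
$$SRC_SCHEMA=SCHEMA_B
```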

@ranjan I tried this approach, but when I have field mappings defined, they get wiped out after parameterizing the object names.

dreamraja

Sir, how can I compare the columns and data types of the source before executing the mapping, and act based on the comparison results?

guddu

Hello sir, I would like to migrate multiple tables from Teradata into S3, but while selecting multiple sources I am not able to select all the tables; only one table gets selected.
I have a requirement to migrate multiple tables through only one mapping task.
Can you please help me, sir?

mdsaad-le

Hi sir,

If we parameterize the Source transformation with a connection parameter and an object parameter, the incoming fields won't be there. How can we transform the data in this case, for example if we want to change a date format?

Is there any way we can both parameterize and transform the data?

PradeepKanaparthy

Hi,
If I want to add transformation logic in a parameterized mapping, how do we add it?

PradeepKanaparthy

Can I use the source object parameter in the query? If possible, how do I do that?

seeanj