Azure Data Engineers

 1. How will you do an incremental load using a Mapping Data Flow, and how would you do it without one? (A watermark-based sketch follows this list.)
 2. What does the PolyBase option in ADF do internally?
 3. How do you execute the activities in a ForEach loop sequentially rather than in parallel? If you choose to run iterations in parallel, you can limit the number of parallel executions by setting the Batch Count; the default is 20 and the maximum is 50. (See the ForEach sketch after this list.)
 4. When one iteration fails in a ForEach loop, how do you make it stop rather than continue with the next iteration? One approach is to add an If Condition activity inside the ForEach loop so that remaining work is skipped once a failure is flagged.
 5. When data comes from multiple sources, say Oracle, Netezza, etc., each with 50 tables to load, how will you design the load?
 6. When loading into Parquet format, how do you partition the output in ADF? (A sketch follows this list.)
 7. How will you choose the latest file from Blob Storage? (A sketch follows this list.)
 8. How do you do error handling in ADF, both for bad rows and for execution errors? For bad rows, in the data flow connect the Source to a Conditional Split transformation and add conditions for error rows and valid rows based on a column such as Date. (A sketch of the same split follows this list.)
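
For question 1, the approach without a Mapping Data Flow usually comes down to the high-watermark pattern: look up the last watermark, copy only newer rows, then advance the watermark. A minimal Python sketch of that logic is below; the connection strings, the etl_watermark control table, and the stage_ table prefix are illustrative assumptions, not part of any ADF API.

```python
import pyodbc

# Placeholder connection strings -- replace with real ones.
SRC_CONN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=src-host;DATABASE=sales;Trusted_Connection=yes"
DST_CONN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=dwh-host;DATABASE=stage;Trusted_Connection=yes"

def incremental_load(table: str, watermark_col: str) -> None:
    """Copy only rows newer than the stored watermark, then advance the watermark."""
    with pyodbc.connect(SRC_CONN) as src, pyodbc.connect(DST_CONN) as dst:
        src_cur, dst_cur = src.cursor(), dst.cursor()

        # 1. Read the last successfully loaded watermark for this table
        #    from an assumed control table named etl_watermark.
        dst_cur.execute("SELECT last_value FROM etl_watermark WHERE table_name = ?", table)
        last_value = dst_cur.fetchone()[0]

        # 2. Pull only the delta from the source.
        src_cur.execute(f"SELECT * FROM {table} WHERE {watermark_col} > ?", last_value)
        rows = src_cur.fetchall()
        if not rows:
            return

        # 3. Land the delta in a staging table (row-by-row insert, kept simple).
        placeholders = ", ".join("?" for _ in rows[0])
        dst_cur.executemany(f"INSERT INTO stage_{table} VALUES ({placeholders})",
                            [tuple(r) for r in rows])

        # 4. Advance the watermark only after the load succeeded.
        dst_cur.execute(
            f"UPDATE etl_watermark SET last_value = "
            f"(SELECT MAX({watermark_col}) FROM stage_{table}) WHERE table_name = ?",
            table,
        )
        dst.commit()

# Example call; table and column names are placeholders.
incremental_load("orders", "modified_date")
```

With a Mapping Data Flow, the same delta filter would typically be expressed in the source options or a Filter transformation keyed on the stored watermark.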
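
On question 3, "Sequential" is a setting on the ForEach activity itself, and Batch Count only applies when the loop runs in parallel. In the pipeline JSON those settings serialize to roughly the shape below, shown here as a Python dict so it can be printed or templated; the parameter name and the inner Copy activity are placeholders.

```python
import json

# Sketch of a ForEach activity definition in pipeline JSON.
# "isSequential": True forces one iteration at a time; otherwise "batchCount"
# caps parallelism (default 20, maximum 50).
for_each_activity = {
    "name": "ForEachTable",
    "type": "ForEach",
    "typeProperties": {
        "isSequential": True,           # set False and use batchCount to run in parallel
        # "batchCount": 10,             # only honoured when isSequential is False
        "items": {
            "type": "Expression",
            "value": "@pipeline().parameters.tableList",  # placeholder parameter
        },
        "activities": [
            {"name": "CopyOneTable", "type": "Copy", "typeProperties": {}},  # placeholder
        ],
    },
}

print(json.dumps(for_each_activity, indent=2))
```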
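
For question 6, partitioning in a Mapping Data Flow is usually configured on the sink's Optimize tab (for example, key partitioning on date-derived columns). Since data flows execute on Spark, the result is equivalent to a partitionBy write; a sketch is below, with the OrderDate column, the storage paths, and the year/month keys as illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("partitioned-parquet-sketch").getOrCreate()

# Illustrative source read; in ADF this would be the data flow's source dataset.
df = spark.read.option("header", "true").csv(
    "abfss://raw@mystorageaccount.dfs.core.windows.net/sales/"
)

# Derive partition columns from a date column (column name is an assumption).
df = (
    df.withColumn("year", F.year(F.to_date("OrderDate")))
      .withColumn("month", F.month(F.to_date("OrderDate")))
)

# One folder per year/month, the same layout key partitioning produces.
(
    df.write.mode("overwrite")
      .partitionBy("year", "month")
      .parquet("abfss://curated@mystorageaccount.dfs.core.windows.net/sales_parquet/")
)
```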
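
For question 7, inside ADF this is typically done with a Get Metadata activity (child items plus last modified) or the copy source's filter-by-last-modified setting. The same selection logic, written directly against Blob Storage with the azure-storage-blob SDK, is sketched below; the connection string, container, and prefix are placeholders.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string -- replace with a real one or a managed identity client.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

def latest_blob_name(container: str, prefix: str = "") -> str:
    """Return the name of the most recently modified blob under a prefix."""
    service = BlobServiceClient.from_connection_string(CONN_STR)
    container_client = service.get_container_client(container)

    blobs = container_client.list_blobs(name_starts_with=prefix)
    newest = max(blobs, key=lambda b: b.last_modified)  # BlobProperties.last_modified
    return newest.name

# Example call; container and prefix are placeholders.
print(latest_blob_name("input", prefix="daily/"))
```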
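
For question 8, the Conditional Split routes each row down a stream based on a boolean expression, so bad rows can be sent to their own error sink while valid rows continue to the main sink; for execution-level errors, a common pattern is to chain a logging or alert activity off the failing activity's Failed dependency. A rough PySpark equivalent of the row-level split on a Date column is below; the column name, date format, and paths are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bad-row-split-sketch").getOrCreate()

df = spark.read.option("header", "true").csv(
    "abfss://raw@mystorageaccount.dfs.core.windows.net/orders/"
)

# Condition mirroring a Conditional Split: a row is "valid" when Date parses.
is_valid = F.to_date(F.col("Date"), "yyyy-MM-dd").isNotNull()

valid_rows = df.filter(is_valid)
error_rows = df.filter(~is_valid)

# Valid rows go to the curated sink, bad rows to an error folder for inspection.
valid_rows.write.mode("append").parquet(
    "abfss://curated@mystorageaccount.dfs.core.windows.net/orders/"
)
error_rows.write.mode("append").parquet(
    "abfss://errors@mystorageaccount.dfs.core.windows.net/orders_bad_rows/"
)
```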