
Azure Hands-on Lab - Migrating Data with Azure Data Factory

This lab uses Azure Cosmos DB. The key points of the lab are:

1. Using the cosmicworks tool to generate the sample data.

2. Understanding the relationship between a Cosmos DB account name, a database id, and a container id.

3. Creating an ADF connection and task that migrates data from the products container of the cosmicworks database to the flatproducts container of the same database.

The lab comes from: Exercise: Migrate existing data using Azure Data Factory - Training | Microsoft Learn
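The second point above, the account/database/container hierarchy, can be sketched as plain string construction (a minimal sketch; dp420 is just an illustrative account name):

```python
# Sketch of the Cosmos DB resource hierarchy used in this lab:
# an account (globally unique name) holds databases, and each
# database holds containers.

def account_endpoint(account_name: str) -> str:
    """The account name becomes the DNS prefix of the account's HTTPS endpoint."""
    return f"https://{account_name}.documents.azure.com:443/"

def container_link(database_id: str, container_id: str) -> str:
    """REST resource path of a container inside a database."""
    return f"dbs/{database_id}/colls/{container_id}"

print(account_endpoint("dp420"))                      # hypothetical account name
print(container_link("cosmicworks", "products"))      # source container
print(container_link("cosmicworks", "flatproducts"))  # sink container
```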

Migrate existing data using Azure Data Factory

In Azure Data Factory, Azure Cosmos DB is supported as a source of data ingest and as a target (sink) of data output.

In this lab, we will populate Azure Cosmos DB using a helpful command-line utility and then use Azure Data Factory to move a subset of data from one container to another.

Create and seed your Azure Cosmos DB SQL API account

You will use a command-line utility that creates a cosmicworks database and a products container at 4,000 request units per second (RU/s). Once created, you will adjust the throughput down to 400 RU/s.

To accompany the products container, you will create a flatproducts container manually that will be the target of the ETL transformation and load operation at the end of this lab.

  1. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  2. Sign into the portal using the Microsoft credentials associated with your subscription.

  3. Select + Create a resource, search for Cosmos DB, and then create a new Azure Cosmos DB SQL API account resource with the following settings, leaving all remaining settings to their default values:

    Subscription: Your existing Azure subscription
    Resource group: Select an existing or create a new resource group
    Account Name: Enter a globally unique name
    Location: Choose any available region
    Capacity mode: Provisioned throughput
    Apply Free Tier Discount: Do Not Apply
    Limit the total amount of throughput that can be provisioned on this account: Unchecked

    📝 Your lab environments may have restrictions preventing you from creating a new resource group. If that is the case, use the existing pre-created resource group.

  4. Wait for the deployment task to complete before continuing with this task.

  5. Go to the newly created Azure Cosmos DB account resource and navigate to the Keys pane.

  6. This pane contains the connection details and credentials necessary to connect to the account from the SDK. Specifically:

    1. Record the value of the URI field. You will use this endpoint value later in this exercise.

    2. Record the value of the PRIMARY KEY field. You will use this key value later in this exercise.

  7. Close your web browser window or tab.

  8. Start Visual Studio Code.

    📝 If you are not already familiar with the Visual Studio Code interface, review the Get Started guide for Visual Studio Code.

  9. In Visual Studio Code, open the Terminal menu and then select New Terminal to open a new terminal instance.

  10. Install the cosmicworks command-line tool for global use on your machine.

    dotnet tool install --global cosmicworks

    💡 This command may take a couple of minutes to complete. It will output a warning message (Tool 'cosmicworks' is already installed) if you have already installed the latest version of this tool in the past.

  11. Run cosmicworks to seed your Azure Cosmos DB account with the following command-line options:

    --endpoint: The endpoint value you copied earlier in this lab
    --key: The key value you copied earlier in this lab
    --datasets: product

    cosmicworks --endpoint <cosmos-endpoint> --key <cosmos-key> --datasets product

    📝 For example, if your endpoint is: https://dp420.documents.azure.com:443/ and your key is: fDR2ci9QgkdkvERTQ==, then the command would be: cosmicworks --endpoint https://dp420.documents.azure.com:443/ --key fDR2ci9QgkdkvERTQ== --datasets product

  12. Wait for the cosmicworks command to finish populating the account with a database, container, and items.

  13. Close the integrated terminal.

  14. Close Visual Studio Code.

  15. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  16. Sign into the portal using the Microsoft credentials associated with your subscription.

  17. Select Resource groups, then select the resource group you created or viewed earlier in this lab, and then select the Azure Cosmos DB account resource you created in this lab.

  18. Within the Azure Cosmos DB account resource, navigate to the Data Explorer pane.

  19. In the Data Explorer, expand the cosmicworks database node, expand the products container node, and then select Items.

  20. Observe and select the various JSON items in the products container. These are the items created by the command-line tool used in previous steps.

  21. Select the Scale & Settings node. In the Scale & Settings tab, select Manual, update the required throughput setting from 4000 RU/s to 400 RU/s, and then Save your changes.

  22. In the Data Explorer pane, select New Container.

  23. In the New Container popup, enter the following values for each setting, and then select OK:

    Database id: Use existing | cosmicworks
    Container id: flatproducts
    Partition key: /category
    Container throughput (autoscale): Manual
    RU/s: 400

  24. Back in the Data Explorer pane, expand the cosmicworks database node and then observe the flatproducts container node within the hierarchy.

  25. Return to the Home of the Azure portal.
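The New Container form in step 23 maps onto a container definition like the sketch below. This is plain data only, no service call is made; the body shape follows the Cosmos DB container resource, and the throughput key name is an illustrative assumption modeled on the SDK's manual-throughput option.

```python
# The "New Container" settings from step 23, expressed as the kind of
# container body the Cosmos DB REST API / SDKs accept. Data only;
# nothing is sent to Azure here.

flatproducts_definition = {
    "id": "flatproducts",        # Container id
    "partitionKey": {
        "paths": ["/category"],  # Partition key
        "kind": "Hash",
    },
}

# Manual (non-autoscale) throughput of 400 RU/s; the key name below is
# illustrative, not an exact API property.
throughput = {"manualThroughput": 400}

print(flatproducts_definition["id"], flatproducts_definition["partitionKey"]["paths"])
```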

Create Azure Data Factory resource

Now that the Azure Cosmos DB SQL API resources are in place, you will create an Azure Data Factory resource and configure all of the necessary components and connections to perform a one-time data movement from one SQL API container to another to extract data, transform it, and load it to another SQL API container.
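Behind the Ingest wizard, Data Factory ultimately produces a copy activity. The sketch below shows roughly what that JSON looks like; the property names follow the ADF copy-activity schema, but the exact document the wizard generates may differ.

```python
import json

# Rough shape of the copy activity the Ingest wizard builds: a Cosmos DB
# SQL API source with a projection query, and a Cosmos DB SQL API sink.
copy_activity = {
    "name": "FlattenAndMoveData",
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "CosmosDbSqlApiSource",
            "query": "SELECT p.name, p.categoryName AS category, p.price FROM products p",
        },
        "sink": {
            "type": "CosmosDbSqlApiSink",
        },
    },
}

print(json.dumps(copy_activity, indent=2))
```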

  1. Select + Create a resource, search for Data Factory, and then create a new Azure Data Factory resource with the following settings, leaving all remaining settings to their default values:

    Subscription: Your existing Azure subscription
    Resource group: Select an existing or create a new resource group
    Name: Enter a globally unique name
    Region: Choose any available region
    Version: V2
    Git configuration: Configure Git later

    📝 Your lab environments may have restrictions preventing you from creating a new resource group. If that is the case, use the existing pre-created resource group.

  2. Wait for the deployment task to complete before continuing with this task.

  3. Go to the newly created Azure Data Factory resource and select Open Azure Data Factory Studio.

    💡 Alternatively, you can navigate to (adf.azure.com/home), select your newly created Data Factory resource, and then select the home icon.

  4. From the home screen, select the Ingest option to begin the quick wizard to perform a one-time copy data at scale operation and move to the Properties step of the wizard.

  5. Starting with the Properties step of the wizard, in the Task type section, select Built-in copy task.

  6. In the Task cadence or task schedule section, select Run once now and then select Next to move to the Source step of the wizard.

  7. In the Source step of the wizard, in the Source type list, select Azure Cosmos DB (SQL API).

  8. In the Connection section, select + New connection.

  9. In the New connection (Azure Cosmos DB (SQL API)) popup, configure the new connection with the following values, and then select Create:

    Name: CosmosSqlConn
    Connect via integration runtime: AutoResolveIntegrationRuntime
    Authentication method: Account key | Connection string
    Account selection method: From Azure subscription
    Azure subscription: Your existing Azure subscription
    Azure Cosmos DB account name: The Azure Cosmos DB account name you chose earlier in this lab
    Database name: cosmicworks

  10. Back in the Source data store section, within the Source tables section, select Use query.

  11. In the Table name list, select products.

  12. In the Query editor, delete the existing content and enter the following query:

    SELECT 
        p.name, 
        p.categoryName as category, 
        p.price 
    FROM 
        products p
  13. Select Preview data to test the query's validity. Select Next to move to the Target step of the wizard.

  14. In the Target step of the wizard, in the Target type list, select Azure Cosmos DB (SQL API).

  15. In the Connection list, select CosmosSqlConn.

  16. In the Target list, select flatproducts and then select Next to move to the Settings step of the wizard.

  17. In the Settings step of the wizard, in the Task name field, enter FlattenAndMoveData.

  18. Leave all remaining fields to their default blank values and then select Next to move to the final step of the wizard.

  19. Review the Summary of the steps you have selected in the wizard and then select Next.

  20. Observe the various steps in the deployment. When the deployment has finished, select Finish.

  21. Close your web browser window or tab.

  22. In a new web browser window or tab, navigate to the Azure portal (portal.azure.com).

  23. Sign into the portal using the Microsoft credentials associated with your subscription.

  24. Select Resource groups, then select the resource group you created or viewed earlier in this lab, and then select the Azure Cosmos DB account resource you created in this lab.

  25. Within the Azure Cosmos DB account resource, navigate to the Data Explorer pane.

  26. In the Data Explorer, expand the cosmicworks database node, select the flatproducts container node, and then select New SQL Query.

  27. Delete the contents of the editor area.

  28. Create a new SQL query that will return all documents where the name is equivalent to HL Headset:

    SELECT 
        p.name, 
        p.category, 
        p.price 
    FROM
        products p
    WHERE
        p.name = 'HL Headset'
  29. Select Execute Query.

  30. Observe the results of the query.
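The transformation the copy task applied (step 12) and the verification filter (step 28) can be sketched in plain Python. The sample documents below are hypothetical, shaped only by the fields the queries reference:

```python
# flatten_product mirrors the source query: project name, rename
# categoryName to category, and keep price.
def flatten_product(doc: dict) -> dict:
    return {
        "name": doc["name"],
        "category": doc["categoryName"],  # "p.categoryName AS category"
        "price": doc["price"],
    }

# Hypothetical source documents (field values are made up).
products = [
    {"id": "1", "name": "HL Headset", "categoryName": "Accessories, Headsets", "price": 102.29},
    {"id": "2", "name": "LL Crankarm", "categoryName": "Components, Cranksets", "price": 27.12},
]

flatproducts = [flatten_product(p) for p in products]

# The verification query's WHERE clause, as a plain filter.
results = [p for p in flatproducts if p["name"] == "HL Headset"]
print(results)
```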
