{"id":746,"date":"2022-12-17T13:12:45","date_gmt":"2022-12-17T13:12:45","guid":{"rendered":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/aws-certified-data-analytics-specialty-das-c01-question060\/"},"modified":"2022-12-17T13:12:45","modified_gmt":"2022-12-17T13:12:45","slug":"aws-certified-data-analytics-specialty-das-c01-question060","status":"publish","type":"post","link":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/aws-certified-data-analytics-specialty-das-c01-question060\/","title":{"rendered":"AWS Certified Data Analytics &#8211; Specialty DAS-C01 &#8211; Question060"},"content":{"rendered":"<div class=\"question\">A financial company uses Amazon S3 as its data lake and has set up a data warehouse using a multi-node Amazon Redshift cluster. The data files in the data lake are organized in folders based on the data source of each data file. All the data files are loaded into one table in the Amazon Redshift cluster using a separate COPY command for each data file location. With this approach, loading all the data files into Amazon Redshift takes a long time to complete. 
Users want a faster solution with little or no increase in cost while maintaining the segregation of the data files in the S3 data lake.<br \/>\nWhich solution meets these requirements?<br \/><br \/><strong>A.<\/strong> Use Amazon EMR to copy all the data files into one folder and issue a COPY command to load the data into Amazon Redshift.<br \/><strong>B.<\/strong> Load all the data files in parallel to Amazon Aurora, and run an AWS Glue job to load the data into Amazon Redshift.<br \/><strong>C.<\/strong> Use an AWS Glue job to copy all the data files into one folder and issue a COPY command to load the data into Amazon Redshift.<br \/><strong>D.<\/strong> Create a manifest file that contains the data file locations and issue a COPY command to load the data into Amazon Redshift.<\/div>\n<style> .hidden-div{ display:none } <\/style>\n<p>\t\t\t\t\t\t\t<button onclick=\"getElementById('hidden-div').style.display = 'block'\"> Show Answer <\/button> <button onclick=\"getElementById('hidden-div').style.display = 'none'\">Hide Answer<\/button><\/p>\n<div class=\"hidden-div\" id=\"hidden-div\">\n<div class=\"answer\">Correct Answer: <strong>D<\/strong><\/div>\n<p><strong>Explanation:<\/strong><\/p>\n<div class=\"explanation\">\nWith the COPY command's MANIFEST option, a manifest file listing every data file location lets a single COPY command load all the files in parallel from their existing S3 folders. The folder-per-source segregation in the data lake is preserved, and no additional services or data movement are needed, so cost stays essentially unchanged. Copying all the files into one folder first (options A and C) would break the segregation and add processing cost, and staging the data in Amazon Aurora (option B) would add both cost and complexity.<br \/>\nReference: <a href=\"https:\/\/docs.aws.amazon.com\/redshift\/latest\/dg\/r_COPY.html\" title=\"External link\" rel=\"nofollow noopener\" target=\"_blank\">https:\/\/docs.aws.amazon.com\/redshift\/latest\/dg\/r_COPY.html<\/a><\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>A financial company uses Amazon S3 as its data lake and has set up a data warehouse using a multi-node Amazon Redshift cluster. The data files in the data lake are organized in folders based on the data source of each data file. 
All the data files are loaded to one table in the Amazon [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[2],"tags":[3,227],"class_list":["post-746","post","type-post","status-publish","format-standard","hentry","category-aws-certified-data-analytics-specialty-das-c01","tag-aws-certified-data-analytics-specialty-das-c01","tag-question-060"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/posts\/746","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/comments?post=746"}],"version-history":[{"count":0,"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/posts\/746\/revisions"}],"wp:attachment":[{"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/media?parent=746"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/categories?post=746"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/exampracticetests.com\/aws\/Data_Analytics-Specialty_DAS-C01\/wp-json\/wp\/v2\/tags?post=746"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}