<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>OpenText Analytics Database 26.2.x – Loading data from Amazon S3 using MC</title>
    <link>/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/</link>
    <description>Recent content in Loading data from Amazon S3 using MC on OpenText Analytics Database 26.2.x</description>
    <generator>Hugo -- gohugo.io</generator>
    
	  <atom:link href="/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/index.xml" rel="self" type="application/rss+xml" />
    
    
      
        
      
    
    
    <item>
      <title>MC: About configuring a data load from S3</title>
      <link>/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/about-configuring-data-load-from-s3/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      
      <guid>/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/about-configuring-data-load-from-s3/</guid>
      <description>
        
        
        &lt;p&gt;When you create an S3 Data Load using MC, you have the option of further configuring the load operation. You can optionally specify the following:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href=&#34;#Add&#34;&gt;Add COPY Parameters&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href=&#34;#Capture&#34;&gt;Capture Rejected Data in a Table&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href=&#34;#Set&#34;&gt;Set a Rejected Records Maximum&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a name=&#34;Add&#34;&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2 id=&#34;add-copy-parameters&#34;&gt;Add COPY parameters&lt;/h2&gt;
&lt;p&gt;MC performs the load operation with &lt;a href=&#34;../../../../../en/sql-reference/statements/copy/#&#34;&gt;COPY&lt;/a&gt;. You can use the COPY Parameters field to further configure the COPY operation. This field accepts parameters that are specified after the COPY statement&#39;s FROM clause. For details on these parameters and special requirements, see &lt;a href=&#34;../../../../../en/sql-reference/statements/copy/parameters/#&#34;&gt;Parameters&lt;/a&gt;.&lt;/p&gt;
&lt;div class=&#34;alert admonition note&#34; role=&#34;alert&#34;&gt;
&lt;h4 class=&#34;admonition-head&#34;&gt;Note&lt;/h4&gt;

The FILTER and PARSER parameters must appear in that order and precede all other parameters.

&lt;/div&gt;
&lt;p&gt;For example, you can specify the DELIMITER and SKIP parameters to separate columns with a comma and to skip one record of input data:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;DELIMITER &amp;#39;,&amp;#39; SKIP 1
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;You can also add comments in this field with &lt;a href=&#34;https://www.techonthenet.com/c_language/comments.php&#34;&gt;standard C comment notation&lt;/a&gt;.&lt;/p&gt;
&lt;div class=&#34;alert admonition note&#34; role=&#34;alert&#34;&gt;
&lt;h4 class=&#34;admonition-head&#34;&gt;Note&lt;/h4&gt;

This field does not support SQL comment notation (double hyphen --).

&lt;/div&gt;
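&lt;p&gt;For example, you can combine several parameters and annotate them with a C-style comment; the values here are illustrative, not requirements:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;DELIMITER &amp;#39;,&amp;#39; SKIP 1 /* comma-delimited file with one header row */
&lt;/code&gt;&lt;/pre&gt;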
&lt;p&gt;&lt;a name=&#34;Capture&#34;&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2 id=&#34;capture-rejected-data-in-a-table&#34;&gt;Capture rejected data in a table&lt;/h2&gt;
&lt;p&gt;Set &lt;strong&gt;Capture rejected data in a table&lt;/strong&gt; to Yes to create a table that contains rejected row data. You can view this data in the Load History tab.&lt;/p&gt;
&lt;p&gt;This table uses the following naming convention:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;&lt;span class=&#34;code-variable&#34;&gt;schema&lt;/span&gt;.s3_load_rejections_&lt;span class=&#34;code-variable&#34;&gt;target-table-name&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;You must have CREATE privilege on the schema if the table doesn&#39;t already exist. When you invoke multiple load processes for the same target table, MC appends all rejected data to the same table. For details, see &lt;a href=&#34;../../../../../en/data-load/handling-messy-data/saving-rejected-data-to-table/#&#34;&gt;Saving rejected data to a table&lt;/a&gt;.&lt;/p&gt;
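&lt;p&gt;For example, if the target table were &lt;code&gt;public.sales&lt;/code&gt; (a hypothetical name for illustration), you could inspect the rejected rows with a query such as the following; the column names follow the standard rejected-data table layout, so verify them against your database version:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;=&amp;gt; SELECT rejected_reason, rejected_data FROM public.s3_load_rejections_sales;
&lt;/code&gt;&lt;/pre&gt;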
&lt;p&gt;&lt;a name=&#34;Set&#34;&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2 id=&#34;set-a-rejected-records-maximum&#34;&gt;Set a rejected records maximum&lt;/h2&gt;
&lt;p&gt;Set &lt;strong&gt;Reject Max&lt;/strong&gt; to the maximum number of rows that can be rejected before the load operation fails. If the number of rows that COPY rejects reaches the specified maximum, OpenText™ Analytics Database rolls back the entire load operation.&lt;/p&gt;
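&lt;p&gt;This setting is equivalent to the COPY statement&#39;s REJECTMAX parameter. For example, a COPY statement that fails once rejections exceed 100 rows (the value is illustrative) would include:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;REJECTMAX 100
&lt;/code&gt;&lt;/pre&gt;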
&lt;h2 id=&#34;see-also&#34;&gt;See also&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/#&#34;&gt;Loading data from Amazon S3 using MC&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/viewing-load-history/#&#34;&gt;Viewing load history&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

      </description>
    </item>
    
    <item>
      <title>MC: Viewing load history</title>
      <link>/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/viewing-load-history/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      
      <guid>/en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/viewing-load-history/</guid>
      <description>
        
        
        &lt;p&gt;You can view a history of all continuous and instance loading jobs in your OpenText™ Analytics Database on the Data Load Activity page.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Continuous jobs:&lt;/strong&gt; Loading jobs that continuously monitor a source and stream data from the source.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Instance jobs:&lt;/strong&gt; Loading jobs that batch load from a source. Instance jobs have a fixed duration and are shorter-term than continuous jobs.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id=&#34;view-continuous-loads&#34;&gt;View continuous loads&lt;/h2&gt;
&lt;p&gt;The Continuous tab on the Data Load Activity page displays the history of your database’s continuous loading jobs. For example, you can see loading jobs you create using the database integration with Kafka (see &lt;a href=&#34;../../../../../en/kafka-integration/#&#34;&gt;Apache Kafka integration&lt;/a&gt;). Additionally, if you enable the MC extended monitoring feature, the Continuous tab displays the continuous jobs that stream data from your monitored database to a storage database. (See &lt;a href=&#34;../../../../../en/mc/monitoring-using-mc/extended-monitoring/#&#34;&gt;Extended monitoring&lt;/a&gt; for more on how MC can use Kafka to monitor databases externally.)&lt;/p&gt;
&lt;p&gt;Use the Continuous tab to view details about continuous jobs, such as their source, target tables, and other microbatch configuration details.&lt;/p&gt;
&lt;p&gt;If extended monitoring is enabled, jobs streaming to the MC storage database show mc_dc_kafka_config as the scheduler name. Deselect &lt;strong&gt;Show MC data collector monitoring streams&lt;/strong&gt; at the top of the tab to remove these jobs from the display.&lt;/p&gt;
&lt;p&gt;In the Continuous tab, click the labels in the &lt;strong&gt;Scheduler&lt;/strong&gt;, &lt;strong&gt;Microbatch&lt;/strong&gt;, and &lt;strong&gt;Errors Last Hour&lt;/strong&gt; columns to view additional details about those loading jobs.&lt;/p&gt;
&lt;p&gt;&lt;img src=&#34;../../../../../images/mc/mc-load-continuous.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;
&lt;p&gt;For more on continuous data streaming terminology, see &lt;a href=&#34;../../../../../en/kafka-integration/data-streaming-integration-terms/#&#34;&gt;Data streaming integration terms&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id=&#34;view-load-instances&#34;&gt;View load instances&lt;/h2&gt;
&lt;p&gt;In the Instance tab, you can see a history of your database&#39;s one-time loading jobs. For example, you can view instance jobs you created using the COPY command in vsql (see &lt;a href=&#34;../../../../../en/sql-reference/statements/copy/#&#34;&gt;COPY&lt;/a&gt;), or instance jobs you created in MC to copy data from an Amazon S3 bucket. (For more about initiating loading jobs in MC, see &lt;a href=&#34;../../../../../en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/#&#34;&gt;Loading data from Amazon S3 using MC&lt;/a&gt;.)&lt;/p&gt;
&lt;p&gt;In the Instance tab, click the labels in the Status and Rejected Rows columns to view more details about completed jobs. For more about rejected rows, see &lt;a href=&#34;../../../../../en/data-load/handling-messy-data/#&#34;&gt;Handling messy data&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img src=&#34;../../../../../images/mc/mc-load-instance.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;
&lt;p&gt;The number of load history results on the Instance tab depends on the &lt;a href=&#34;../../../../../en/glossary/data-collector/#&#34;&gt;Data collector&lt;/a&gt; retention policy for Requests Issued and Requests Completed. To change the retention policy, see &lt;a href=&#34;../../../../../en/admin/monitoring/data-collector-utility/configuring-data-retention-policies/#&#34;&gt;Configuring data retention policies&lt;/a&gt;.&lt;/p&gt;
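&lt;p&gt;As a sketch, you can view and adjust retention for a component with the data collector functions; the component name and values below are assumptions to verify against your database version:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;=&amp;gt; SELECT get_data_collector_policy(&amp;#39;RequestsIssued&amp;#39;);
=&amp;gt; SELECT set_data_collector_policy(&amp;#39;RequestsIssued&amp;#39;, &amp;#39;1000&amp;#39;, &amp;#39;10000&amp;#39;);
&lt;/code&gt;&lt;/pre&gt;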
&lt;h2 id=&#34;see-also&#34;&gt;See also&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/mc/cloud-platforms/aws-mc/loading-data-from-amazon-s3-using-mc/#&#34;&gt;Loading data from Amazon S3 using MC&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/kafka-integration/#&#34;&gt;Apache Kafka integration&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/kafka-integration/data-streaming-integration-terms/#&#34;&gt;Data streaming integration terms&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href=&#34;../../../../../en/sql-reference/statements/copy/#&#34;&gt;COPY&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

      </description>
    </item>
    
  </channel>
</rss>
