Hi James,

For one thing, it looks to me that we should not configure the credentials in
pom.xml. Instead, we could add a core-site.xml to the classpath, configured
like:

<configuration>
    <property>
        <name>fs.s3a.access.key</name>
        <value><KEY></value>
    </property>

    <property>
        <name>fs.s3a.secret.key</name>
        <value><SECRET></value>
    </property>
</configuration>
I tried the above method with the attached program, and it worked for me.
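One more point: since your pom.xml also sets fs.s3a.endpoint, that property would need to move into the same core-site.xml. For non-AWS endpoints, path-style access is often required as well — the fragment below is a sketch, assuming your endpoint does not serve virtual-host-style buckets:

```xml
    <property>
        <name>fs.s3a.endpoint</name>
        <value><ENDPOINT></value>
    </property>

    <property>
        <name>fs.s3a.path.style.access</name>
        <value>true</value>
    </property>
```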

Besides, judging from the attached exception, there also seems to be a network
problem. Could you change the credential configuration and try again? If the
problem persists, could you check whether S3 is accessible from your machine?
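Since the stack trace shows NoRouteToHostException and "connect timed out", it may help to first verify that the endpoint is reachable at the TCP level from the machine running IntelliJ. Below is a minimal probe I sketched for this (the host name is a placeholder for your actual endpoint; adjust the port if your endpoint is plain HTTP):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Quick TCP reachability probe for the S3 endpoint, to separate
// credential problems from network problems.
public class EndpointCheck {
    // Returns "reachable" if a TCP connection succeeds within the timeout,
    // otherwise "unreachable: <reason>".
    static String check(String host, int port, int timeoutMillis) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMillis);
            return "reachable";
        } catch (Exception e) {
            return "unreachable: " + e;
        }
    }

    public static void main(String[] args) {
        // <ENDPOINT_HOST> is a placeholder for your S3 endpoint host.
        System.out.println(check("<ENDPOINT_HOST>", 443, 5_000));
    }
}
```

If this prints "unreachable", the issue is in the network path (routing, firewall, VPN) rather than in the Flink or Hadoop configuration.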

Best,
Yun
------------------ Original Mail ------------------
Sender: James Kim <kgh475...@gmail.com>
Send Date: Thu Sep 16 02:24:41 2021
Recipients: Flink ML <user@flink.apache.org>
Subject: Flink S3A failed to connect to service endpoint from IntelliJ IDE
I'm trying to write Java code on IntelliJ IDE to make use of the Table API and 
the data I'm using is going to be from a CSV file over s3a. The IDE project is 
in Maven and has a pom.xml like the following:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>groupId</groupId>
    <artifactId>flink-ecs-sample</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>8</maven.compiler.source>
        <maven.compiler.target>8</maven.compiler.target>

            <name>fs.s3a.access.key</name>
            <value>myAccessKey</value>

            <name>fs.s3a.secret.key</name>
            <value>mySecretKey</value>

            <name>fs.s3a.endpoint</name>
            <value>myEndPoint</value>

    </properties>



    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge_2.11</artifactId>
            <version>1.13.2</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner-blink_2.11</artifactId>
            <version>1.13.2</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-scala_2.11</artifactId>
            <version>1.13.2</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-common</artifactId>
            <version>1.13.2</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-csv</artifactId>
            <version>1.13.2</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.13.2</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-s3-fs-hadoop -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-s3-fs-hadoop</artifactId>
            <version>1.13.2</version>
        </dependency>

    </dependencies>

</project>

And my Main.java class is the following:
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;
import org.apache.flink.types.Row;
import org.apache.flink.util.CloseableIterator;

public class Main {
    public static void main(String[] args) {
        // create a TableEnvironment for batch or streaming execution
        EnvironmentSettings settings = EnvironmentSettings
                .newInstance()
                .inBatchMode()
                .build();

        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // create an input Table
        // create an input Table
        TableResult tempResult = tableEnv.executeSql(
//                "create temporary table ATHLETES (\n" +
                "create table ATHLETES (\n" +
                "name varchar,\n" +
                "country varchar,\n" +
                "sport varchar\n" +
                ") with (\n" +
                "'connector' = 'filesystem',\n" +
                "'path'='s3a://testbucket/james_experiment/2020_Tokyo_Olympics/Athletes.csv',\n" +
                "'format'='csv'\n" +
                ")\n");

        TableResult table2 = tableEnv.executeSql("select * from ATHLETES");
    }
}

When I build and run directly from the IDE, I get an error saying a couple of
things:
- INFO: Error when creating PropertyDescriptor for public final void 
org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)!
 Ignoring this property.
- com.amazonaws.SdkClientException: Failed to connect to service endpoint: 
- Caused by: java.net.NoRouteToHostException: No route to host (Host 
unreachable)
- Caused by: java.net.SocketTimeoutException: connect timed out

How can I set up Flink directly in the IntelliJ IDE and fix this issue?
