With Athena Federated Query, you can run SQL queries across data stored in relational, non-relational, object, and custom data sources. A data source connector is a piece of code that translates between your target data source and Athena; Athena uses data source connectors that run on AWS Lambda to execute federated queries.

Querying from the console itself, however, is very limited. We can't really do much with the data there, and whenever we want to analyse it, we can't sit in front of the console all day running queries manually. Hence I am going the Lambda way: run a query on the Athena-created table and store the result back to S3, which I can then use to create visualizations in Amazon QuickSight.

Since we already know about AWS Athena, let's integrate that code with Lambda so that we can query Athena from a Lambda function and get the results back. I am building the Lambda function with Python; the Python function runtime gets invocation events from Lambda and passes them to the handler.

A second Lambda function (scheduled periodically by CloudWatch) polls SQS Queue-2; Lambda-2 checks the query execution status from Athena. Delete the message from SQS Queue-2 if the status was Success or Failed. If the query state was "Failed" but the reason is not "AlreadyExistsException", then add the message back to SQS Queue-1.

For Node.js, athena-express simplifies integrating Amazon Athena with any application, running either standalone or as a Lambda function. As a wrapper on the AWS SDK, athena-express bundles the steps listed in the official AWS documentation: it initiates a query execution and keeps checking until the query has finished executing.

From Python, the pydb wrapper covers the same steps, for example:

response = pydb.get_athena_query_response("SELECT * from database.table limit 10000", return_athena_types=True)
print(response['meta'])

If you wish to read your SQL query directly into a pandas DataFrame, you can use the read_sql function. You can pass *args or **kwargs into this function, which are forwarded to pd.read_csv().
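The Queue-2 polling rules above (delete on Success or Failed; requeue on a failure other than AlreadyExistsException) can be sketched as a small pure helper. This is only an illustration: the function name `decide_action` and the action strings are my own, not part of any AWS API, and in the real Lambda-2 they would map to `sqs.delete_message` and `sqs.send_message` calls.

```python
def decide_action(state, reason=None):
    """Decide what Lambda-2 should do with a polled Queue-2 message.

    state  -- the Athena query state, e.g. "SUCCEEDED", "FAILED",
              "RUNNING" (as reported by the query execution status)
    reason -- the failure reason string, if any
    """
    actions = []
    # Success or failure means this message's work on Queue-2 is done.
    if state in ("SUCCEEDED", "FAILED"):
        actions.append("delete_from_queue_2")
    # A failure that is not AlreadyExistsException should be retried,
    # so the original message goes back onto Queue-1.
    if state == "FAILED" and reason != "AlreadyExistsException":
        actions.append("requeue_on_queue_1")
    return actions
```

A still-running query matches neither rule, so the message stays on Queue-2 for the next scheduled poll.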
In the last post, we saw how to query data from S3 using Amazon Athena in the AWS Console.

On the Lambda tab, select the Lambda functions corresponding to the Athena federated connectors that your federated queries use. If you followed the post Extracting and joining data from multiple data sources with Athena Federated Query when configuring your Athena federated connectors, you can select dynamo, hbase, mysql, and redis.

Use the examples in this topic as a starting point for writing Athena applications using the SDK for Java 2.x. For more information about running the Java code examples, see the Amazon Athena Java Readme in the AWS Code Examples Repository on GitHub.

Everything will be executed using infrastructure as code from our Serverless Framework project. The test command will start the specified task (in our case run_query) from a given DAG (simple_athena_query …):

airflow test simple_athena_query run_query 2019-05-21

In the function configuration, the handler value is lambda_function.lambda_handler. I am trying to access the header, request, and query parameters passed as part of the request. I am able to access the body, but not the other request or header parameters; what I have tried so far has been of no use.

The second Lambda will create a new SQL query with the name provided in the query parameters and then query the product list using Athena.

Saving a Product to S3

The first Lambda will create a new object and store it as JSON in an S3 bucket.
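To make the header/query-parameter question above concrete, here is a minimal handler sketch, assuming the event shape that API Gateway's Lambda proxy integration sends (the keys `headers`, `queryStringParameters`, and `body` come from that event format; the specific field names such as `name` are illustrative, not from the original post):

```python
import json

def lambda_handler(event, context):
    # With the API Gateway Lambda proxy integration, the headers, query
    # string parameters, and body arrive as separate keys of the event dict.
    headers = event.get("headers") or {}
    query_params = event.get("queryStringParameters") or {}
    # The body is a JSON string, so it must be parsed explicitly.
    body = json.loads(event["body"]) if event.get("body") else {}

    return {
        "statusCode": 200,
        "body": json.dumps({
            "user_agent": headers.get("User-Agent"),
            "name": query_params.get("name"),  # e.g. the query name for the second Lambda
            "body_keys": sorted(body.keys()),
        }),
    }
```

The usual pitfall is reading the body and stopping there; the headers and query string are siblings of `body` in the event, not nested inside it.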