
Showing posts from December, 2019

SQL Joins

Joins are used to combine rows from two or more tables based on related columns between them. A join lets you retrieve data from multiple tables in a single query, enabling complex queries that pull data from different sources. There are several types of joins in SQL, including:

1. INNER JOIN

Returns only the rows that have matching values in both tables based on the specified join condition. Non-matching rows from both tables are discarded.

Example:

create table t1(x int);
insert into t1 values(1);
insert into t1 values(1);
insert into t1 values(0);

create table t2(y int);
insert into t2 values(0);
insert into t2 values(1);
insert into t2 values(1);

select * from t1 inner join t2 on t1.x = t2.y;

Output:

x | y
--+--
1 | 1
1 | 1
1 | 1
1 | 1
0 | 0

Each of the two 1s in t1 matches each of the two 1s in t2 (four rows), and the single 0 in t1 matches the single 0 in t2 (one row).

2. LEFT JOIN (or LEFT OUTER JOIN)

Returns all the rows from the left (or first) table and the matching rows from the right (or second) table. If there is no match, NULL values are returned for the columns of the right table.
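The same experiment can be run end to end with Python's built-in sqlite3 module; here is a minimal self-contained sketch (the extra row with x = 2 is my addition, not in the post, so that LEFT JOIN's NULL behavior becomes visible):

import sqlite3

# In-memory database: nothing to install or clean up.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Same tables and data as the example above.
cur.execute("create table t1(x int)")
cur.executemany("insert into t1 values(?)", [(1,), (1,), (0,)])
cur.execute("create table t2(y int)")
cur.executemany("insert into t2 values(?)", [(0,), (1,), (1,)])

# INNER JOIN keeps only matching pairs.
print(cur.execute(
    "select * from t1 inner join t2 on t1.x = t2.y").fetchall())
# [(1, 1), (1, 1), (1, 1), (1, 1), (0, 0)]  (row order may vary)

# Add a row with no partner in t2; LEFT JOIN keeps it, with NULL (None) for y.
cur.execute("insert into t1 values(2)")
print(cur.execute(
    "select * from t1 left join t2 on t1.x = t2.y").fetchall())
# the unmatched left row survives as (2, None)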

Airflow basic custom web view

Folder Structure

plugins
|_test_plugin
   |_templates
      |_test.html
   test_plugin.py

test_plugin.py

from airflow.plugins_manager import AirflowPlugin
from flask import Blueprint
from flask_admin import BaseView, expose
from flask_admin.base import MenuLink


class TestView(BaseView):
    @expose('/')
    def test(self):
        # Renders templates/test.html, passing it a "content" variable.
        return self.render("test.html", content="Hello galaxy!")


v = TestView(category="Test Plugin", name="Test View")

# Blueprint registers this plugin's templates folder with Flask.
blue_print_ = Blueprint("test_plugin",
                        __name__,
                        template_folder='templates')


class AirflowTestPlugin(AirflowPlugin):
    name = "MenuLinks"
    # operators = []
    flask_blueprints = [blue_print_]
    # hooks = []
    # executors = []
    admin_views = [v]
    # appbuilder_views = [v_appbuilder_package]
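The commented-out appbuilder_views line points at the newer FAB-based (RBAC) web UI. As a sketch, assuming Airflow 1.10+ with the RBAC UI enabled, the same "Hello galaxy!" view can be registered like this (the class and plugin names here are illustrative, not from the post):

from airflow.plugins_manager import AirflowPlugin
from flask_appbuilder import BaseView as AppBuilderBaseView, expose


class TestAppBuilderView(AppBuilderBaseView):
    default_view = "test"  # route served when the menu entry is clicked

    @expose("/")
    def test(self):
        # Same templates/test.html as the admin view above.
        return self.render_template("test.html", content="Hello galaxy!")


# Package dict wires the view into the "Test Plugin" menu category.
v_appbuilder_package = {
    "name": "Test View",
    "category": "Test Plugin",
    "view": TestAppBuilderView(),
}


class AirflowTestAppBuilderPlugin(AirflowPlugin):
    name = "test_appbuilder_plugin"
    appbuilder_views = [v_appbuilder_package]

Either way, templates/test.html is assumed to be a small Jinja template that prints {{ content }}.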

Airflow dynamic dag creation and chain sequentially

DAG1:

import airflow.utils.dates
import airflow.utils.helpers
from airflow import DAG
from airflow.operators.dagrun_operator import TriggerDagRunOperator
from airflow.operators.python_operator import PythonOperator
from airflow.operators.sensors import ExternalTaskSensor
from airflow.models.variable import Variable
from airflow.utils.state import State
from datetime import datetime, timedelta
import dateutil.parser

dag = DAG(
    dag_id="dag1",
    default_args={"owner": "airflow", "start_date": airflow.utils.dates.days_ago(2)},
    schedule_interval="@once",
    catchup=False
)


def trigger_dag_with_context(context, dag_run_obj):
    # Forward the job name to the triggered DAG as its run payload.
    dag_run_obj.payload = {'job': context['params']['job']}
    return dag_run_obj


l1 = []
l2 = []
i = 0
for job in ["job1", "job2"]:
    trigger = TriggerDagRunOperator(
        task_id=job + "_dag",
        # The post is cut off at this argument; the rest of the call is a
        # reconstruction based on the trigger_dag_with_context callback above.
        trigger_dag_id="dag2",  # assumed id of the DAG being triggered
        python_callable=trigger_dag_with_context,
        params={"job": job},
        dag=dag
    )
    l1.append(trigger)  # collect the generated triggers for chaining
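The teaser ends mid-call, so what follows is a hedged sketch of how these pieces typically fit together in Airflow 1.x: the collected triggers are chained to run one after another, and the triggered DAG reads the payload from dag_run.conf. The dag2 id and the print_job callable are assumptions, not from the original post.

# Still in DAG1: run job1's trigger before job2's.
airflow.utils.helpers.chain(*l1)

# dag2.py - a minimal target DAG that consumes the payload.
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
import airflow.utils.dates

dag2 = DAG(
    dag_id="dag2",
    default_args={"owner": "airflow", "start_date": airflow.utils.dates.days_ago(2)},
    schedule_interval=None  # runs only when triggered by dag1
)


def print_job(**kwargs):
    # The payload set in trigger_dag_with_context arrives as dag_run.conf.
    print(kwargs["dag_run"].conf["job"])


PythonOperator(
    task_id="print_job",
    python_callable=print_job,
    provide_context=True,  # Airflow 1.x: pass the context into the callable
    dag=dag2
)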