
I am writing a new app that connects to an old database. For now, I'm reflecting the database objects rather than defining them manually as SQLAlchemy ORM classes. I've set up my Flask app like so:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import MetaData

config = Config.get_config_for_env(env)
app = Flask(__name__)
app.config.from_object(config)
metadata = MetaData()
db = SQLAlchemy(metadata=metadata)
db.init_app(app)
app.db = db
app.app_context().push()

# Reflect DB
db.metadata.reflect(bind=db.engine, views=True)

The call above reflects the entire database. My apps normally touch only a few tables at a time, so it makes sense to reflect tables lazily. That can be done like so:

db.metadata.reflect(bind=db.engine, schema='MySchema', only=['MyTable'])

To do this, I'll need to insert a layer that ensures the schema and table have been reflected before a query is executed. That adds complexity, because every query will have to go through another layer of code. Is there a way to reflect a schema and table implicitly, on demand, as the query is made?
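
To give an idea of what I mean by "another layer", here is a minimal sketch of such a helper (the name, the caching set and the default schema are purely illustrative, nothing built in):

_reflected = set()

def reflected_table(name, schema='MySchema'):
    # reflect (schema, name) into db.metadata on first use, then return the Table
    key = (schema, name)
    if key not in _reflected:
        db.metadata.reflect(bind=db.engine, schema=schema, only=[name])
        _reflected.add(key)
    return db.metadata.tables[f'{schema}.{name}']

# every query then has to go through it, e.g.:
# rows = db.session.execute(reflected_table('MyTable').select()).fetchall()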

2 Answers

Well, if you know the name of the table you need, then you can do:

table_to_work_with = Table("tablename", db.metadata, autoload_with=db.engine)

And you can use sqlalchemy.inspect to get the table names, as described here: List database tables with SQLAlchemy
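
For example, something along these lines (assuming SQLAlchemy 1.4+, where inspect() is called on the engine and autoload_with replaces the older autoload/bind arguments):

from sqlalchemy import Table, inspect

inspector = inspect(db.engine)
print(inspector.get_table_names(schema='MySchema'))  # tables available for reflection

# reflect a single table only when it is actually needed
my_table = Table('MyTable', db.metadata, schema='MySchema', autoload_with=db.engine)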


AFAIK, there is no built-in way to do this implicitly, but you can build it yourself. I do it via a test class, something like this, where self.clone() reflects a table from the source database and creates a copy of it in a temporary database:

import logging
from copy import copy
from functools import lru_cache

from lazy import lazy  # assumed: a cached-property-style decorator (PyPI "lazy" package)
from sqlalchemy import MetaData, Table, create_engine
from sqlalchemy.schema import CreateSchema, DropSchema
from sqlalchemy_utils import database_exists


class TempDbApp(BaseApp):
    def __init__(self, env_src, name='TempDbApp', *args, **kwargs):
        super().__init__('t', name, *args, **kwargs)
        self.env_src = env_src
        self.logger = logging.getLogger(__name__)
        self.schemas = ['dbo']
        self.metadata = MetaData()

    def setup(self):
        super().setup()
        self.test_db_name = self.create_db()

    def teardown(self):
        # Do not drop db at end because pool
        super().teardown()
        self.metadata.drop_all(self.db.engine)
        for schema in self.schemas:
            if schema != 'dbo':
                self.db.engine.execute(DropSchema(schema))
        self.drop_db()

    def create_db(self):
        url = copy(self.db.engine.url)
        engine = create_engine(url, connect_args={'autocommit': True}, isolation_level='AUTOCOMMIT')
        res = engine.execute("exec dbo.usp_createShortLivedDB")
        tempdbname = res.fetchone()[0]
        res.close()
        engine.dispose()
        # point the app's engine at the freshly created temp database
        # (mutating engine.url like this only works on SQLAlchemy < 1.4, where URL is still mutable)
        self.db.engine.url.database = tempdbname
        return tempdbname

    def drop_db(self):
        url = copy(self.db.engine.url)
        db = url.database
        url.database = 'master'
        engine = create_engine(url, connect_args={'autocommit': True}, isolation_level='AUTOCOMMIT')

        if database_exists(url):
            assert db != 'master'
            res = engine.execute(f"EXEC dbo.usp_deleteshortliveddb @dbname = '{db}'")
            res.close()

    def fetch_schemas(self):
        results = self.db.engine.execute('SELECT name FROM sys.schemas')
        for schema in results:
            self.schemas.append(schema[0])
        results.close()

    def create_schema(self, schema):
        with self.session() as sess:
            sess.execute(CreateSchema(schema))
        self.schemas.append(schema)

    @lru_cache(maxsize=2048)
    def clone(self, table, schema):
        """Reflect a table from the source DB and create it in the temp DB (cached per table/schema)."""
        if schema not in self.schemas:
            self.create_schema(schema)

        self.metadata.reflect(self.engine_src, schema=schema, only=[table])
        self.metadata.drop_all(self.db.engine)  # precautionary, in case a previous test didn't clean up
        self.metadata.create_all(self.db.engine)

    @lru_cache(maxsize=2048)
    def get_table(self, table, schema, db=None):
        self.clone(table, schema)
        return super().get_table(table, schema, db)

    @lru_cache(maxsize=2048)
    def get_selectable(self, table, schema, db=None):
        self.clone(table, schema)
        return Table(table, self.db.metadata, schema=schema, autoload=True, autoload_with=self.db.get_engine(bind=db))

    @lazy
    def engine_src(self):
        conn_string = f'mssql+pymssql://user:pass@{self.env_src}-slo-sql-ds/mydb?charset=utf8'
        return create_engine(conn_string, isolation_level='AUTOCOMMIT')

    def start(self):
        raise Exception("You must not call this method - this app is for testing")

Then a test class can be created using multiple inheritance:

from typing import final

@final
class MyRealWorldClassTest(TempDbApp, MyRealWorldClass):
    pass
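
In a test, cloning then happens implicitly the first time a table is requested. Roughly like this (the table name, schema and environment below are illustrative only):

# the first get_table() call triggers clone(), which reflects dbo.Customer
# from the source database and creates it in the temporary database
app = MyRealWorldClassTest(env_src='dev')
app.setup()

customers = app.get_table('Customer', 'dbo')
with app.session() as sess:
    rows = sess.execute(customers.select()).fetchall()

app.teardown()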
