Table Definition
We define tables in Iceaxe using the `TableBase` class. You set each column's type by
attaching a Python type hint to the class attribute, and you can further configure
a column by customizing its `Field`.
```python
from datetime import datetime

from iceaxe import TableBase, Field, PostgresDateTime, DBConnection

class Employee(TableBase):
    id: int | None = Field(primary_key=True, default=None)
    name: str
    age: int
    payload: datetime = Field(postgres_config=PostgresDateTime(timezone=True))
```
Right now this table only exists in Python. To create it in the database, we need to issue the SQL that creates the table. You can write this SQL manually or rely on our harness to generate it for you.
```python
import asyncpg

from iceaxe.schemas.cli import create_all

conn = DBConnection(
    await asyncpg.connect(
        host="localhost",
        port=5432,
        user="db_user",
        password="yoursecretpassword",
        database="your_db",
    )
)

await create_all(conn)
```
By default, `create_all` will create all tables that are subclasses of `TableBase` that
are currently imported into your Python environment. Make sure all the models that you
use are imported into that namespace before you call `create_all`. One convenient convention
to follow is to define your schemas in a models folder and then re-export them from
the `models/__init__.py` file.
```
myproject/
├── __init__.py
└── models/
    ├── __init__.py
    ├── employee.py
    └── office.py
```
```python
# models/__init__.py
from .employee import Employee as Employee
from .office import Office as Office
```
This allows you to import the models module once and have all of your models available from that single import:
```python
from myproject import models

print(models.Employee)
print(models.Office)

await create_all(conn)
You can also limit the models that are created by passing a list of explicit models to
the models argument:
```python
await create_all(conn, models=[DemoCustomModel])
```
Fields
This `TableBase` is a subclass of Pydantic's `BaseModel`, which means that
you can use Pydantic's validation and serialization features on your
tables in addition to our database-specific flags. You can read more
about the Pydantic features in the Pydantic documentation. We
define our full database options in the Iceaxe API docs.
To use one example, let's say that you want to validate that the age is at least 16. Add a new field validator for that particular field:
```python
from iceaxe import TableBase, Field
from pydantic import field_validator

class Employee(TableBase):
    id: int | None = Field(primary_key=True, default=None)
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def validate_age(cls, v: int) -> int:
        if v < 16:
            raise ValueError("You must be at least 16 to receive a tax return")
        return v

# Successful validation
employee_1 = Employee(name="John Doe", age=16)
print(employee_1)

# Raises a pydantic_core._pydantic_core.ValidationError
employee_2 = Employee(name="John Doe", age=15)
print(employee_2)
```
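Because `TableBase` inherits from Pydantic's `BaseModel`, the serialization helpers also work on table instances. A minimal sketch using a plain Pydantic model (standing in for a `TableBase` subclass, since the methods are inherited unchanged):

```python
from pydantic import BaseModel, field_validator

# Plain Pydantic model standing in for a TableBase subclass; TableBase
# inherits model_dump and model_dump_json from BaseModel.
class Employee(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def validate_age(cls, v: int) -> int:
        if v < 16:
            raise ValueError("You must be at least 16 to receive a tax return")
        return v

emp = Employee(name="John Doe", age=16)
print(emp.model_dump())       # plain dict of the field values
print(emp.model_dump_json())  # JSON string of the same data
```

This is handy when returning rows from an API endpoint, since the same model handles both database persistence and response serialization.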
Structured JSON fields
Field(is_json=True) also supports structured values, including Pydantic models. Iceaxe will infer
a JSON column for the field and deserialize rows back into the annotated Python type:
```python
from pydantic import BaseModel

from iceaxe import TableBase, Field

class Preferences(BaseModel):
    theme: str
    notifications: bool

class Employee(TableBase):
    id: int | None = Field(primary_key=True, default=None)
    name: str
    preferences: Preferences = Field(is_json=True)
```
When you insert, select, update, or refresh an `Employee`, the `preferences` attribute remains a
`Preferences` instance instead of a raw dict. If you prefer JSONB, you can still override the
storage type with `Field(is_json=True, explicit_type=ColumnType.JSONB)` after importing
`ColumnType` from `iceaxe.sql_types`.
Simple scalar subclasses
Iceaxe also accepts simple subclasses of built-in storage types. This is useful when you want domain-specific types without giving up ORM support:
```python
from uuid import UUID

from iceaxe import TableBase, Field

class EmployeeId(UUID):
    pass

class Employee(TableBase):
    id: EmployeeId = Field(primary_key=True)
    name: str
```
The value is stored using the base PostgreSQL type, but reads come back as EmployeeId instances
for both full-model selects and direct column selects. This works for simple subclasses of
UUID, str, int, float, bool, bytes, date, time, datetime, and timedelta.
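To illustrate the type-level behavior without a database, here is a stdlib-only sketch: a simple `UUID` subclass compares equal to, and can be rebuilt from, its base value, which is what makes this kind of round-trip possible.

```python
from uuid import UUID, uuid4

class EmployeeId(UUID):
    pass

raw = uuid4()                    # the value as a driver might return it
emp_id = EmployeeId(raw.hex)     # rebuild it as the domain-specific subclass

assert isinstance(emp_id, UUID)  # still usable anywhere a UUID is expected
assert emp_id == raw             # same underlying value as the base UUID
```

The subclass adds no behavior of its own, so it serializes and compares exactly like its base type; it just gives you a distinct name to carry through type checks and function signatures.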