JSON

Every database supports JSON — PostgreSQL has json/jsonb, MySQL and MariaDB have JSON, DuckDB has JSON, SQL Server has FOR JSON, and Oracle has its own JSON type. Foundations gives you a unified way to work with JSON across all of them: your RowCodec doubles as a JSON codec with zero extra code.

JSON-Encoded Column Types

Pass a RowCodec to your database's jsonArrayEncoded or jsonObjectEncoded method to get a column type that reads and writes structured rows as JSON. An unnamed codec can produce positional JSON arrays ([value1, value2, ...]), while a named codec can also produce keyed JSON objects ({"column": value, ...}):

data class OrderLine(val product: String, val qty: Int, val price: BigDecimal)

val lineCodec: RowCodecNamed<OrderLine> =
    RowCodec.namedBuilder<OrderLine>()
        .field("product", DuckDbTypes.varchar, OrderLine::product)
        .field("qty", DuckDbTypes.integer, OrderLine::qty)
        .field("price", DuckDbTypes.decimal(10, 2), OrderLine::price)
        .build(::OrderLine)

// Stores rows as positional JSON arrays: [["Widget", 3, 9.99], ...]
val arrayType: DuckDbType<List<OrderLine>> =
    DuckDbTypes.jsonArrayEncodedList(lineCodec)

// Stores rows as named JSON objects: [{"product": "Widget", "qty": 3, "price": 9.99}, ...]
// Column names come from the codec — no redundant list to maintain
val objectType: DuckDbType<List<OrderLine>> =
    DuckDbTypes.jsonObjectEncodedList(lineCodec)
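To make the two wire formats concrete, here is a hand-rolled sketch of the JSON text each encoding produces for a single row. These helper functions are illustrations only, not part of the library — the real encoding is done by the codec:

```kotlin
import java.math.BigDecimal

// Mirrors the OrderLine class above, for a self-contained example
data class OrderLine(val product: String, val qty: Int, val price: BigDecimal)

// What a positional (unnamed) encoding looks like on the wire
fun positionalJson(l: OrderLine): String =
    """["${l.product}", ${l.qty}, ${l.price}]"""

// What a keyed (named) encoding looks like on the wire
fun keyedJson(l: OrderLine): String =
    """{"product": "${l.product}", "qty": ${l.qty}, "price": ${l.price}}"""

fun main() {
    val line = OrderLine("Widget", 3, BigDecimal("9.99"))
    println(positionalJson(line)) // ["Widget", 3, 9.99]
    println(keyedJson(line))      // {"product": "Widget", "qty": 3, "price": 9.99}
}
```

The positional form is more compact; the keyed form is self-describing and easier to query with the database's JSON path operators.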

Every database has these methods:

Database        Methods
PostgreSQL      PgTypes.jsonArrayEncoded / jsonbArrayEncoded + object variants
MariaDB/MySQL   MariaTypes.jsonArrayEncoded + object variants
DuckDB          DuckDbTypes.jsonArrayEncoded + object variants
SQL Server      SqlServerTypes.jsonArrayEncoded + object variants
Oracle          OracleTypes.jsonArrayEncoded + object variants
DB2             Db2Types.jsonArrayEncoded + object variants

Each method also has a List variant (e.g. jsonArrayEncodedList) for columns that hold a JSON array of rows.
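To illustrate what the List variant stores, here is a hand-rolled sketch (again, not the library's code) of how a column holding a JSON array of rows simply nests each per-row encoding inside an outer array:

```kotlin
import java.math.BigDecimal

// Mirrors the OrderLine class above, for a self-contained example
data class OrderLine(val product: String, val qty: Int, val price: BigDecimal)

// Single-row positional encoding
fun rowJson(l: OrderLine): String =
    """["${l.product}", ${l.qty}, ${l.price}]"""

// The List variant wraps each encoded row in one outer JSON array
fun listJson(lines: List<OrderLine>): String =
    lines.joinToString(prefix = "[", postfix = "]", separator = ", ") { rowJson(it) }

fun main() {
    val lines = listOf(
        OrderLine("Widget", 3, BigDecimal("9.99")),
        OrderLine("Gadget", 1, BigDecimal("24.50")),
    )
    println(listJson(lines))
    // [["Widget", 3, 9.99], ["Gadget", 1, 24.50]]
}
```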

Aggregating Child Rows as JSON

The real power of JSON-encoded types shows up when you aggregate child rows directly in SQL. Instead of issuing N+1 queries, use your database's JSON aggregation function and parse the result with the same JSON-encoded column type:

Database        Aggregation function
PostgreSQL      json_agg() / jsonb_agg()
MariaDB/MySQL   JSON_ARRAYAGG()
DuckDB          json_group_array()
SQL Server      FOR JSON PATH
Oracle          JSON_ARRAYAGG()
DB2             JSON_ARRAYAGG()

// Aggregate child rows as JSON in a single query,
// parsed with the positional arrayType defined above
fun getOrderLines(customerId: Int): List<OrderLine> =
    sql { """
        SELECT json_group_array(json_array(product, qty, price))
        FROM order_lines
        WHERE customer_id = ${DuckDbTypes.integer(customerId)}
    """ }
        .query(RowCodec.of(arrayType).exactlyOne())
        .transact(tx)

The column type decodes with the same field types your RowCodec defines: no separate deserialization layer, no mapping code, no drift between your SQL types and your JSON types.