Given a description of a database and its tables, the Goblimey Scaffolder creates that database and generates a Go web server that provides Create, Read, Update and Delete (CRUD) operations on it. The generated server follows the Model-View-Controller (MVC) architecture and handles requests through a RESTful interface.
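To illustrate the RESTful CRUD convention such a generated server typically follows, the sketch below maps an HTTP method and path to a controller action for a hypothetical `people` table. This is an assumption-laden illustration of the usual REST routing convention, not the Scaffolder's actual generated code.

```go
package main

import (
	"fmt"
	"strings"
)

// action returns the name of the CRUD controller action that a
// RESTful request would conventionally be routed to. A path like
// /people addresses the whole collection; /people/42 addresses
// one record by ID. (Hypothetical helper for illustration only.)
func action(method, path string) string {
	parts := strings.Split(strings.Trim(path, "/"), "/")
	hasID := len(parts) == 2 // e.g. ["people", "42"]
	switch {
	case method == "GET" && !hasID:
		return "Index" // read: list all records
	case method == "POST" && !hasID:
		return "Create" // create a new record
	case method == "GET" && hasID:
		return "Show" // read: fetch one record
	case method == "PUT" && hasID:
		return "Update" // update an existing record
	case method == "DELETE" && hasID:
		return "Delete" // delete a record
	}
	return "NotFound"
}

func main() {
	fmt.Println(action("GET", "/people"))       // Index
	fmt.Println(action("POST", "/people"))      // Create
	fmt.Println(action("GET", "/people/42"))    // Show
	fmt.Println(action("PUT", "/people/42"))    // Update
	fmt.Println(action("DELETE", "/people/42")) // Delete
}
```

In an MVC server each of these actions would live on a controller, which calls the model for database access and renders a view for the response.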
A simple PHP model with no dependencies that covers 90+% of typical needs when reading data from or writing data to a database, including relational data spanning multiple tables.