I have looked at examples online on how to create a geometry field with peewee and this is what I came up with so far:
class GeometryField(Field):
    db_field = 'geometry'

    def db_value(self, value):
        return fn.ST_GeomFromGeoJSON(value)

    def python_value(self, value):
        return fn.ST_AsGeoJSON(value)
I have added this definition to a table like so:
class GeoJSON(BaseModel):
    geojson_id = UUIDField(primary_key=True, default=uuid.uuid4)
    geometry = GeometryField()
This doesn't run, though, and I don't understand what I'm missing.
My aim is to manage insertions of geometric entities into the DB so that later I can make use of PostGIS to query based on locations.
The error I'm getting in the init phase:
peewee.ProgrammingError: syntax error at or near "NOT" LINE 1:
...geojson_id" UUID NOT NULL PRIMARY KEY, "geometry" NOT NULL)
I init the table like so:
GeoJSON.create_table("geojsons")
What did I miss here? Do I need to do anything else before this GeometryField can be used?
Is there a built-in geometry field that peewee supports out of the box that I don't know about?
The problem was that the PostGIS extension had failed to install, so the database didn't recognize the geometry column type.
Once I fixed that and had the extension installed, the solution above worked perfectly.
If you are coming back to this question later: the peewee syntax changed in version 3.0, and db_field was renamed to field_type.
So to get it working:
Make sure you install the PostGIS extension on your Postgres database:
CREATE EXTENSION postgis;
Create a custom field:
class GeometryField(Field):
    field_type = 'geometry'

    def db_value(self, value):
        return fn.ST_GeomFromGeoJSON(value)

    def python_value(self, value):
        return fn.ST_AsGeoJSON(value)
Use the custom field in your model:
class GeoJSON(BaseModel):
    geojson_id = UUIDField(primary_key=True, default=uuid.uuid4)
    geometry = GeometryField()
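As a usage sketch (not part of the original answer; the coordinates are made up, and the insert itself needs a live PostGIS database behind the model), the field expects a GeoJSON string on the way in:

```python
import json

# Build the GeoJSON string that fn.ST_GeomFromGeoJSON() expects as input.
point = json.dumps({"type": "Point", "coordinates": [19.04, 47.50]})
print(point)

# Against a configured PostGIS-backed database you would then insert it
# (not executed here, since it needs a live connection):
#   GeoJSON.create(geometry=point)
```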
Peewee Docs
I'm trying to use the drift library (renamed from moor), but it's unable to create tables because of this error:
SqliteException(1): parameters prohibited in CHECK constraints, SQL logic error (code 1)
CREATE TABLE IF NOT EXISTS enterprise (name TEXT NOT NULL CHECK(LENGTH(name) <= ?), ..other fields);
This is the table class causing the error:
class Enterprise extends Table {
  TextColumn get name =>
      text().check(name.length.isSmallerOrEqualValue(maxNameLength))();
  // ...other fields
}
The error goes away if I remove the check. Could someone explain why the check isn't working?
It turns out to be a bug in drift. The author of drift suggested the workaround below; the bug will be fixed in a future release.
Replace
check(name.length.isSmallerOrEqualValue(maxNameLength))
With this:
check(name.length.isSmallerOrEqual(Constant(maxNameLength)))
I'm loading some data from a CSV file that has some extra notes fields that I don't want in the DB. Is there an option to just ignore extra fields when storing to the DB?
I think Mongoose did this by default, which has the downside that data goes missing without warning when your schema is wrong, but that's what I want in this case.
Otherwise what is a way to reflect and get the schema so I can remove extra fields from the data manually?
I'm getting this error on .create
Unknown arg `notes` in data.notes for type WalletCreateInput.
Did you mean `name`?
Available args:
...
Adding extra fields is not allowed when interacting with a Prisma query.
The current behaviour is intentional: the error acts as an extra validation layer to make sure you're passing the right data.
There is a feature request that discusses allowing extra fields when interacting with a query.
As of now, destructuring the fields that are necessary and passing only those is the only option.
Late to the party, but there is a way around this.
If you use the "fieldReference" preview feature:
generator client {
provider = "prisma-client-js"
previewFeatures = ["fieldReference"]
}
You can then create the following to strip out any extra keys.
function stripPrisma<T extends {}>(input: { fields: {} }, data: T): T {
  const validKeys = Object.keys(input.fields);
  const dataCopy: any = { ...data };
  for (const key of Object.keys(data)) {
    if (!validKeys.includes(key)) {
      delete dataCopy[key];
    }
  }
  return dataCopy as T;
}
And use it like this:
data = stripPrisma(prisma.myTable, data);
prisma.myTable.create({ data: data });
It is not perfect, since it will only be able to use "checked input", meaning you can only use the foreign key in your input and not the foreign object.
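The underlying idea, dropping any key the schema doesn't know before handing the row to the client, is language-agnostic. A minimal sketch in Python (the field names here are hypothetical, not from the Prisma schema above):

```python
def strip_unknown(valid_keys, data):
    """Return a copy of data containing only the keys the schema knows."""
    return {key: value for key, value in data.items() if key in valid_keys}

# Hypothetical model fields, and a CSV row carrying an extra "notes" column:
wallet_fields = {"id", "name"}
row = {"id": 1, "name": "savings", "notes": "imported from CSV"}

clean = strip_unknown(wallet_fields, row)
print(clean)  # {'id': 1, 'name': 'savings'}
```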
(It's similar to this, but this time it knows it's POSTGRES.)
I have an UpdateQuery:
val uq = kontxt.updateQuery(table("felhaszn"))
uq.addValue(field("kilepett"), null as Timestamp?)
and it generates (println(uq.getSQL())):
update felhaszn set /*...*/ kilepett = cast(? as varchar) /*...*/
Why varchar?
At uq.execute() it throws an
ERROR: column "kilepett" is of type timestamp without time zone but expression is of type character varying
The output of println(uq.toString()) shows the correct SQL, though, so I could work around it with kontxt.query(uq.toString()).execute() rather than uq.execute().
This wouldn't happen if you would be using the code generator, in case of which you'd have type information attached to your columns, and jOOQ could bind the correct type:
val uq = kontxt.updateQuery(FELHASZN)
uq.addValue(FELHASZN.KILEPETT, null as Timestamp?)
Your cast to Timestamp? doesn't help here, because that type information isn't maintained by the JVM at runtime. The null reference has no type, and your field("kilepett") doesn't have any type information attached to it, either (it's a Field<Object?>). If for some reason you cannot use the code generator (I highly recommend you do!), you'll have to attach this information explicitly:
uq.addValue(field("kilepett", SQLDataType.TIMESTAMP), null as Timestamp?)
I'm trying to build an application with Symfony 3.3.1 using a Postgres 9.2 database. The database schema already exists and can't be changed.
In the following example I'm trying to fetch a Stud object from the database; it is connected to some other tables by foreign keys. Some columns have the type "timestamp(6) without time zone".
class StudentController extends Controller
{
    /**
     * @Route("/student")
     */
    public function indexAction()
    {
        $studRepository = $this->getDoctrine()->getRepository(Stud::class);
        $student = $studRepository->findOneBy(['sid' => 123456]);
        var_dump($student);
    }
}
I get the following error message: "Could not convert database value "2017-05-26 12:21:30.565211+02" to Doctrine Type datetimetz. Expected format: Y-m-d H:i:sO"
I tried to add something like this to the config.yml without success.
doctrine:
    dbal:
        mapping_types:
            datetimetz: datetime
I also found a hint in the Doctrine documentation under "Known Vendor Issues", but I don't know how to use something like
Type::overrideType('datetimetz', 'Doctrine\DBAL\Types\VarDateTimeType');
properly. I hope somebody has a tip for me, thanks!
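As a side note on the error above: the stored value carries fractional seconds, which is exactly what the expected format Y-m-d H:i:sO cannot parse. The same mismatch can be reproduced outside PHP; a minimal Python illustration (not the Doctrine fix; the raw "+02" offset is padded to "+0200" here so strptime can read it):

```python
from datetime import datetime

# The stored value, with the offset padded from "+02" to "+0200".
value = "2017-05-26 12:21:30.565211+0200"

# The equivalent of Doctrine's "Y-m-d H:i:sO": no fractional seconds.
strict_failed = False
try:
    datetime.strptime(value, "%Y-%m-%d %H:%M:%S%z")
except ValueError:
    strict_failed = True  # the ".565211" part breaks the strict format

# A format that allows fractional seconds parses the same value fine:
parsed = datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f%z")
print(strict_failed, parsed.microsecond)
```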
I'm using the Pdo_Mssql adapter against a Sybase database and working around the issues encountered. One pesky issue remaining is Zend_Db's insistence on quoting BIT field values. When running the following for an insert:
$row = $this->createRow();
...
$row->MyBitField = $data['MyBitField'];
...
$row->save();
FreeTDS log output shows:
dbutil.c:87:msgno 257: "Implicit conversion from datatype 'VARCHAR' to 'BIT' is not allowed. Use the CONVERT function to run this query."
I've tried casting values as int and bool, but this seems to be a table metadata problem, not a data type problem with input.
Fortunately, Zend_Db_Expr works nicely. The following works, but I'd like to be database server agnostic.
$row->MyBitField = new Zend_Db_Expr("CONVERT(BIT, {$data['MyBitField']})");
I've verified that the describeTable() is returning BIT for the field. Any ideas on how to get ZF to stop quoting MS SQL/Sybase BIT fields?
You can simply try this (it works for the MySQL BIT type):
$row->MyBitField = new Zend_Db_Expr($data['MyBitField']);