I am using Go and Postgres. Postgres has an advanced feature where it can return query results in JSON format. What I want to do is take that JSON result and return it, but I am having trouble since it has to be a string in order to return it. This is my code:
package main

import (
    "database/sql"
    "io"
    "log"
    "net/http"

    _ "github.com/lib/pq"
)

func HelloServer(w http.ResponseWriter, req *http.Request) {
    db, err := sql.Open("postgres", "user=postgres password=password dbname=name sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    rows, err := db.Query("SELECT to_json(t) FROM (SELECT * FROM cars) t")
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()
    // rows now holds the JSON results, but I don't know how to turn them into a string
    io.WriteString(w, "hello, world!\n")
}

func main() {
    http.HandleFunc("/hello", HelloServer)
    log.Fatal(http.ListenAndServe(":12345", nil))
}
The query returns a JSON array; how can I turn that rows value into a string? In C# or Java I would just call .ToString() on it and it would become a string. As you can see from the code above, io.WriteString takes a string as its second parameter, so I want to turn the rows variable into a string once the JSON has been returned and display it in the browser by passing it to that method. I want to replace "hello, world!" with that string.
rows is a *sql.Rows value. In order to use the data returned by your database query you will have to iterate over the rows.
An example from the docs:
age := 27
rows, err := db.Query("SELECT name FROM users WHERE age=?", age)
if err != nil {
    log.Fatal(err)
}
defer rows.Close()
for rows.Next() {
    var name string
    if err := rows.Scan(&name); err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%s is %d\n", name, age)
}
if err := rows.Err(); err != nil {
    log.Fatal(err)
}
In your case you should use QueryRow instead, because you expect the database to return a single result. Either way, once you have used Scan to put the data into your own variable, you can parse the JSON or print it out.
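For illustration, here is a minimal sketch of that approach. I switched the query to json_agg (my choice, not from the question) so the whole table comes back as one JSON array in a single row, which is what QueryRow expects; COALESCE covers the empty-table case.

func HelloServer(w http.ResponseWriter, req *http.Request) {
    db, err := sql.Open("postgres", "user=postgres password=password dbname=name sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    var jsonStr string
    // json_agg aggregates every row of the subquery into a single JSON array,
    // so one QueryRow + Scan into a string is enough.
    err = db.QueryRow("SELECT COALESCE(json_agg(t), '[]'::json) FROM (SELECT * FROM cars) t").Scan(&jsonStr)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    w.Header().Set("Content-Type", "application/json")
    io.WriteString(w, jsonStr)
}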
I'm trying to query from multiple databases. Each database is connected using the following function:
func connectDB(dbEnv string) *sql.DB {
    // Load environment variables from the given .env file
    err := godotenv.Load(dbEnv)
    if err != nil {
        log.Fatalf("Some error occurred. Err: %s", err)
    }

    dialect := os.Getenv("DIALECT")
    host := os.Getenv("HOST")
    dbPort := os.Getenv("DBPORT")
    user := os.Getenv("USER")
    dbName := os.Getenv("NAME")
    password := os.Getenv("PASSWORD")

    // Database connection string
    dbURI := fmt.Sprintf("port=%s host=%s user=%s password=%s dbname=%s sslmode=disable", dbPort, host, user, password, dbName)

    // Create database object
    db, err := sql.Open(dialect, dbURI)
    if err != nil {
        log.Fatal(err)
    }
    return db
}
type order struct {
    OrderID string `json:"orderID"`
    Name    string `json:"name"`
}

type book struct {
    BookID string `json:"bookID"`
    Name   string `json:"name"`
}
func getOrders(db *sql.DB) []order {
    var (
        orderID string
        name    string
    )
    var allRows = []order{}

    query := `
        SELECT orderID, name
        FROM orders.orders;
    `

    // Get rows using the query
    rows, err := db.Query(query)
    if err != nil { // Log if error
        log.Fatal(err)
    }
    defer rows.Close()

    // Add each row into the "allRows" slice
    for rows.Next() {
        err := rows.Scan(&orderID, &name)
        if err != nil {
            log.Fatal(err)
        }
        // Create a new order struct with the received data
        row := order{
            OrderID: orderID,
            Name:    name,
        }
        allRows = append(allRows, row)
    }

    // Log if error
    err = rows.Err()
    if err != nil {
        log.Fatal(err)
    }
    return allRows
}
func getBooks(db *sql.DB) []book {
    var (
        bookID string
        name   string
    )
    var allRows = []book{}

    query := `
        SELECT bookID, name
        FROM books.books;
    `

    // Get rows using the query
    rows, err := db.Query(query)
    if err != nil { // Log if error
        log.Fatal(err)
    }
    defer rows.Close()

    // Add each row into the "allRows" slice
    for rows.Next() {
        err := rows.Scan(&bookID, &name)
        if err != nil {
            log.Fatal(err)
        }
        // Create a new book struct with the received data
        row := book{
            BookID: bookID,
            Name:   name,
        }
        allRows = append(allRows, row)
    }

    // Log if error
    err = rows.Err()
    if err != nil {
        log.Fatal(err)
    }
    return allRows
}
func main() {
    ordersDB := connectDB("ordersDB.env")
    booksDB := connectDB("booksDB.env")

    orders := getOrders(ordersDB)
    books := getBooks(booksDB)
}
The issue is that when I use ordersDB first, the program only recognizes the table in ordersDB. And when I use booksDB first, the program only recognizes the table in booksDB.
When I try to query a table in booksDB after using ordersDB, I get the error relation "books.books" does not exist. When I try to query a table in ordersDB after using booksDB, I get relation "orders.orders" does not exist.
Is there a better way to connect to multiple databases?
You are using github.com/joho/godotenv to load the database configuration from the environment. Summarising (and cutting out a lot of detail) what you are doing is:
godotenv.Load("ordersDB.env")
host := os.Getenv("HOST")
// Connect to DB
godotenv.Load("booksDB.env")
host := os.Getenv("HOST")
// Connect to DB 2
However as stated in the docs "Existing envs take precedence of envs that are loaded later". This is also stated more clearly here "It's important to note that it WILL NOT OVERRIDE an env variable that already exists".
So your code loads the first .env file, populates the environment variables, and connects to the database. You then load the second .env file but, because the environment variables are already set, they are not changed and you connect to the same database a second time.
As a workaround you could use Overload. However, it is probably better to reconsider your use of environment variables (and perhaps use differently named variables for the second connection).
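For illustration, a minimal sketch of the Overload variant, keeping the function and file names from the question (everything else is my own choice, not the original code):

func connectDB(dbEnv string) *sql.DB {
    // Unlike Load, Overload overrides variables that are already set,
    // so the second .env file actually takes effect.
    if err := godotenv.Overload(dbEnv); err != nil {
        log.Fatalf("could not load %s: %s", dbEnv, err)
    }

    dbURI := fmt.Sprintf("port=%s host=%s user=%s password=%s dbname=%s sslmode=disable",
        os.Getenv("DBPORT"), os.Getenv("HOST"), os.Getenv("USER"),
        os.Getenv("PASSWORD"), os.Getenv("NAME"))

    db, err := sql.Open(os.Getenv("DIALECT"), dbURI)
    if err != nil {
        log.Fatal(err)
    }
    return db
}

The alternative is to give each database its own variable names (for example ORDERS_HOST and BOOKS_HOST), so the two files can be loaded side by side without clashing.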
I'm using SELECT * in a db.Query() call to return columns from a table. Typically I would rows.Scan() the results into a pre-declared struct for further manipulation, but in this case the table columns change frequently, so I'm not able to use a declared struct as part of my Scan().
I've been struggling to figure out how I might dynamically build a struct based on the column results of the db.Query(), which I could subsequently use with Scan(). I've read a little about reflect, but I'm struggling to determine whether it is right for my use case or whether I need to think about something else.
Any pointers would be greatly appreciated.
You can get the column names from the resulting row set and prepare a slice for the scan.
Example (https://go.dev/play/p/ilYmEIWBG5S):
package main

import (
    "database/sql"
    "fmt"
    "log"

    "github.com/DATA-DOG/go-sqlmock"
)

func main() {
    // mock db
    db, mock, err := sqlmock.New()
    if err != nil {
        log.Fatal(err)
    }
    columns := []string{"id", "status"}
    mock.ExpectQuery("SELECT \\* FROM table").
        WillReturnRows(sqlmock.NewRows(columns).AddRow(1, "ok"))

    // actual code
    rows, err := db.Query("SELECT * FROM table")
    if err != nil {
        log.Fatal(err)
    }
    cols, err := rows.Columns()
    if err != nil {
        log.Fatal(err)
    }

    data := make([]interface{}, len(cols))
    strs := make([]sql.NullString, len(cols))
    for i := range data {
        data[i] = &strs[i]
    }

    for rows.Next() {
        if err := rows.Scan(data...); err != nil {
            log.Fatal(err)
        }
        for i, d := range data {
            fmt.Printf("%s = %+v\n", cols[i], d)
        }
    }
}
This example reads all columns into strings. To detect column types, one can use the rows.ColumnTypes method.
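As a rough illustration of that last point (not part of the original answer), the loop below prints what the driver reports for each column; exactly what DatabaseTypeName returns depends on the driver:

colTypes, err := rows.ColumnTypes()
if err != nil {
    log.Fatal(err)
}
for _, ct := range colTypes {
    // Name is the column name, DatabaseTypeName the driver's type name
    // (e.g. "VARCHAR", "INT4"), ScanType a Go type suitable for scanning.
    fmt.Printf("%s: db type=%s, scan type=%v\n",
        ct.Name(), ct.DatabaseTypeName(), ct.ScanType())
}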
I am trying to insert data that is stored in a slice of structs into MongoDB using the go.mongodb.org/mongo-driver library. My struct is
type Statement struct {
    ProductID string `bson:"product_id" json:"product_id"`
    ModelNum  string `bson:"model_num" json:"model_num"`
    Title     string `bson:"title" json:"title"`
}
and my insertion code is
func insert(stmts []Statement) {
    client, err := mongo.NewClient(options.Client().ApplyURI("mongodb://127.0.0.1:27017"))
    if err != nil {
        log.Fatal(err)
    }
    ctx, _ := context.WithTimeout(context.Background(), 10*time.Second)
    err = client.Connect(ctx)
    if err != nil {
        log.Fatal(err)
    }
    defer client.Disconnect(ctx)

    quickstartDatabase := client.Database("quickstart")
    testCollection := quickstartDatabase.Collection("test")

    testCollection.InsertMany(ctx, stmts) // This is giving the error
}
The compiler gives the error cannot use stmts (variable of type []Statement) as []interface{} value in argument to testCollection.InsertMany at the InsertMany command.
I have tried marshalling the struct before inserting using bson.Marshal but even that doesn't work. How do I insert this data into the DB?
InsertMany accepts []interface{}, so I would convert the slice like this:

newValue := make([]interface{}, len(stmts))
for i := range stmts {
    newValue[i] = stmts[i]
}
testCollection.InsertMany(ctx, newValue)
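For context, here is roughly how that conversion could slot into the insert function from the question; this is a sketch with error handling added by me, not the poster's exact code:

func insert(stmts []Statement) {
    client, err := mongo.NewClient(options.Client().ApplyURI("mongodb://127.0.0.1:27017"))
    if err != nil {
        log.Fatal(err)
    }
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()
    if err := client.Connect(ctx); err != nil {
        log.Fatal(err)
    }
    defer client.Disconnect(ctx)

    // InsertMany only accepts []interface{}, so copy the typed slice over.
    docs := make([]interface{}, len(stmts))
    for i := range stmts {
        docs[i] = stmts[i]
    }

    coll := client.Database("quickstart").Collection("test")
    if _, err := coll.InsertMany(ctx, docs); err != nil {
        log.Fatal(err)
    }
}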
I am writing an abstraction over the official mongo driver which consists of a struct containing a pointer to the collection I need, plus CRUD methods on it. In order to work with multiple types (all of which have bson annotations) I use an interface called Storable, but I don't see a way in which I could decode the documents without knowing the exact type. Code snippet:
func (c *Collection) GetAll() ([]models.Storable, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := c.coll.Find(ctx, bson.M{})
    if err != nil {
        return nil, err
    }

    var result []models.Storable
    for cursor.Next(ctx) {
        var doc models.Storable
        err = cursor.Decode(&doc)
        if err != nil {
            return nil, err
        }
        result = append(result, doc)
    }
    return result, nil
}
An example type:

type User struct {
    ID        primitive.ObjectID `json:"id" bson:"_id,omitempty"`
    FirstName string             `json:"firstName" bson:"firstName"`
    LastName  string             `json:"lastName" bson:"lastName"`
    Email     string             `json:"email" bson:"email"`
    Password  string             `json:"password" bson:"password"`
}
You need to somehow pass in the correct type to decode into. To be able to pass in different types you could use the empty interface{}. However, if you pass the entire slice (e.g. []User) in as an interface{}, you cannot append to it any more without complex use of reflection.
Using a function to create a new row
As #mkopriva mentioned in the comments, we could pass a function creating a new row to it (or pass it to the collection on initialisation):
func (c *Collection) GetAll(newRow func() models.Storable) ([]models.Storable, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := c.coll.Find(ctx, bson.M{})
    if err != nil {
        return nil, err
    }

    var result []models.Storable
    for cursor.Next(ctx) {
        nRow := newRow()
        err = cursor.Decode(nRow)
        if err != nil {
            return nil, err
        }
        result = append(result, nRow.(models.Storable))
    }
    return result, nil
}
You would then call with a function returning the correct type (must be a pointer):
rows, err := c.GetAll(func() models.Storable {
    return new(User)
})
Here is my reference implementation, using JSON decoding to test this: Playground
Using reflection to create a new row
You could pass in a value of the row type:
func (c *Collection) GetAll(row models.Storable) ([]models.Storable, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := c.coll.Find(ctx, bson.M{})
    if err != nil {
        return nil, err
    }

    var result []models.Storable
    for cursor.Next(ctx) {
        nRow := reflect.New(reflect.TypeOf(row)).Interface()
        err = cursor.Decode(nRow)
        if err != nil {
            return nil, err
        }
        result = append(result, nRow.(models.Storable))
    }
    return result, nil
}
You would then call this with the correct type:
var user User
rows, err := c.GetAll(user)
Note that I don't use a pointer type here! The pointer is taken via reflection in the function.
Here is my reference implementation, using JSON decoding to test this: Playground
The downside of the "one function for all" approach is that you end up with a slice of models.Storable containing different concrete types, and you have to use a type switch or type assertion to work with the data.
For this reason I don't use generic functions for database calls but create a new function for every call I need: e.g. GetAllUsers, GetUserByID, GetAllContacts, etc.
Note: this is one of the things I will rethink when generics get added to Go.
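Since generics landed in Go 1.18, a rough sketch of what such a rethink could look like follows. This is my adaptation, not part of the original answer; note that methods cannot have type parameters, so it has to be a free function taking the collection, living in the same package as Collection:

func GetAll[T any](c *Collection) ([]T, error) {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    cursor, err := c.coll.Find(ctx, bson.M{})
    if err != nil {
        return nil, err
    }
    defer cursor.Close(ctx)

    var result []T
    for cursor.Next(ctx) {
        var row T
        if err := cursor.Decode(&row); err != nil {
            return nil, err
        }
        result = append(result, row)
    }
    return result, cursor.Err()
}

// usage: users, err := GetAll[User](c)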
I'm brand new to Go, and I've started working on some postgres queries, and I'm having very little luck.
I have a package that's just going to have some database queries in it. Here's my code.
main.go
package main

import (
    "fmt"
    // the myqueries package also needs to be imported here via your module path (omitted)
)

func main() {
    fmt.Println("Querying data")
    myqueries.SelectAll("mytable")
}
myqueries.go
package myqueries

import (
    "database/sql"
    "fmt"

    _ "github.com/lib/pq" // postgres driver
)

func SelectAll(table string) {
    db, err := sql.Open("postgres", "user=postgres dbname=mydb sslmode=disable")
    if err != nil {
        fmt.Println(err)
    }
    defer db.Close()

    rows, err := db.Query("SELECT * FROM $1", table)
    if err != nil {
        fmt.Println(err)
    } else {
        PrintRows(rows)
    }
}
func PrintRows(rows *sql.Rows) {
    for rows.Next() {
        var firstname string
        var lastname string
        err := rows.Scan(&firstname, &lastname)
        if err != nil {
            fmt.Println(err)
        }
        fmt.Println("first name | last name")
        fmt.Printf("%v | %v\n", firstname, lastname)
    }
}
The error I get is pq: syntax error at or near "$1", which comes from the db.Query call in myqueries.go.
I've tried several variations of this, but nothing has worked yet. Any help is appreciated.
It looks like you are using https://github.com/lib/pq based on the error message, and its docs say that
pq uses the Postgres-native ordinal markers, as shown above
I've never known a database engine that allows parameters anywhere other than in values, so I think you are going to have to resort to string concatenation. I don't have a Go compiler available to me right now, but try something like the code below. Because you are inserting the table name by concatenation, it needs to be sanitized; pq.QuoteIdentifier should be able to help with that.
func SelectAll(table string) {
    db, err := sql.Open("postgres", "user=postgres dbname=mydb sslmode=disable")
    if err != nil {
        fmt.Println(err)
    }
    defer db.Close()

    table = pq.QuoteIdentifier(table)
    rows, err := db.Query(fmt.Sprintf("SELECT * FROM %v", table))
    if err != nil {
        fmt.Println(err)
    } else {
        PrintRows(rows)
    }
}
EDIT: Thanks to hobbs for pointing out pq.QuoteIdentifier.
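One small detail worth noting (my addition): the driver is usually imported blank, but calling pq.QuoteIdentifier requires a named import, which still registers the driver, for example:

import (
    "database/sql"
    "fmt"

    "github.com/lib/pq" // named import so pq.QuoteIdentifier is callable; still registers the driver
)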