I need advice on Go integration testing using Postgres db in Docker - postgresql

I am very new to the Go world. I have some db functions that I need to test.
So first I have a database.go file that connects to a Postgres db:
package database

import (
    "fmt"
    "os"

    "gorm.io/driver/postgres"
    "gorm.io/gorm"
)

var DB *gorm.DB
var err error

// Open connects to Postgres using connection details from the environment
// and stores the handle in the package-level DB.
func Open() error {
    dsn := fmt.Sprintf("host=%s user=%s password=%s dbname=%s port=%s sslmode=disable",
        os.Getenv("HOST"), os.Getenv("USER"), os.Getenv("PASSWORD"), os.Getenv("DB"), os.Getenv("PORT"))
    DB, err = gorm.Open(postgres.Open(dsn), &gorm.Config{})
    if err != nil {
        return err
    }
    return nil
}
Then I have a customers.go file with functions that interact with that db:
package customers

import (
    "time"

    "customers/cmd/database"
)

type Customers struct {
    ID           int
    CustomerName string
    Active       bool
    Balance      float32
    ActiveSince  time.Time
}

// Get loads a customer by primary key using the shared database handle.
func Get(id int) (Customers, error) {
    var customer Customers
    result := database.DB.First(&customer, id)
    if result.Error != nil {
        return Customers{}, result.Error
    }
    return customer, nil
}
This is all running in Docker; there is a customers container and a postgres container. Now the question is: how do I test my Get(id int) function? I was researching dockertest, but that spins up a different db and my Get function uses the one I specified in database.go. So is there a standard Go way to test these functions?

This is a Docker networking problem rather than a Go one. You can create a Docker network and run both containers on the same network (see the Docker networking docs).
Alternatively, use --network=host, or publish the Postgres container's port to the host with -p <host-port>:<container-port> so the customers container (and your tests) can reach it via localhost.
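For example, once the Postgres port is published to the host, your existing database.Open can be pointed at that container from a test purely through environment variables. A minimal sketch, assuming the container exposes 5432 on localhost, a throwaway customers_test database with the customers table already migrated, and the import path from the question; the credentials are placeholders:

package customers // same package as the Get function from the question

import (
    "os"
    "testing"

    "customers/cmd/database" // import path taken from the question
)

func TestGet(t *testing.T) {
    // Point database.Open at the dockerised Postgres whose port is
    // published to the host; all of these values are assumptions.
    os.Setenv("HOST", "localhost")
    os.Setenv("PORT", "5432")
    os.Setenv("USER", "postgres")
    os.Setenv("PASSWORD", "postgres")
    os.Setenv("DB", "customers_test")

    if err := database.Open(); err != nil {
        t.Fatalf("could not connect to test database: %v", err)
    }

    // Seed a known row, then exercise Get against the real database.
    seed := Customers{CustomerName: "test", Active: true}
    if err := database.DB.Create(&seed).Error; err != nil {
        t.Fatalf("could not seed customer: %v", err)
    }
    defer database.DB.Delete(&seed)

    got, err := Get(seed.ID)
    if err != nil {
        t.Fatalf("Get(%d) returned error: %v", seed.ID, err)
    }
    if got.CustomerName != seed.CustomerName {
        t.Errorf("got %q, want %q", got.CustomerName, seed.CustomerName)
    }
}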

There is a standard way of unit testing in Go; please refer to the testing package and testify/assert. Unit tests are typically written in an xxx_test.go file next to the code.
Coming to unit testing of db access layer code, one option would be to have a testenv helper and use it along these lines.
customers_test.go:
package dbaccess

import "testing"

func TestDbAccessLayer(t *testing.T) {
    testEnv := testUtils.NewTestEnv()
    // testEnv should do the required initialization, e.g.
    // start any mocked services, connect to the database, set up a global logger, etc.
    if err := testEnv.Start(); err != nil {
        t.Fatal(err)
    }
    // at the end of the test, Stop needs to reset state,
    // e.g. clear any entries created by the test in the db
    defer testEnv.Stop()
    // add test code
    // add required assertions to validate
}
Have a separate docker-compose.yml file and use it with the docker compose command to start/stop services like the Postgres db.
The go test command may be used to run the tests; refer to the command docs for details.
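As a rough illustration of what that helper could look like, here is a minimal sketch; testUtils, NewTestEnv, Start and Stop are hypothetical names, and the helper simply wraps the database.Open from the question plus a table cleanup (the customers table name is an assumption based on the question's model):

package testUtils

import "customers/cmd/database" // import path taken from the question

// TestEnv bundles the setup/teardown needed by database-level tests.
type TestEnv struct{}

func NewTestEnv() *TestEnv { return &TestEnv{} }

// Start connects to the Postgres instance started via docker compose,
// reusing the same environment variables database.Open already reads.
func (e *TestEnv) Start() error {
    return database.Open()
}

// Stop resets any state created by the test.
func (e *TestEnv) Stop() error {
    return database.DB.Exec("TRUNCATE TABLE customers RESTART IDENTITY").Error
}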

Related

Concurrent index creation fails when done in a Go program

I am trying to create some concurrent indexes using the command CREATE INDEX CONCURRENTLY ... through migrations in my golang project. But whenever I run that particular migration it just takes infinitely long and is never executed.
I went and checked the logs of the Postgres DB and found this:
The weird thing is that it is only in migrations that I am not able to create concurrent indexes; if I write code in my main.go to execute the query directly, it executes successfully, and I can even create an index concurrently from the DB query console.
Here is my migration package code:
func NewGorm(d *gorm.DB) *GORM {
    return &GORM{db: d}
}

func (g *GORM) Run(m Migrator, app, name, methods string, logger log.Logger) error {
    g.txn = g.db.Begin()
    ds := &datastore.DataStore{ORM: g.db}

    var err error
    if methods == UP {
        err = m.Up(ds, logger)
    } else {
        err = m.Down(ds, logger)
    }
    if err != nil {
        g.rollBack()
        return &errors.Response{Reason: "error encountered in running the migration", Detail: err}
    }
    g.commit()
    return nil
}
I know it has something to do with transactions, but I also tried disabling them by passing the flag SkipDefaultTransaction: true when initializing the connection with GORM; that also didn't work and the results were the same.
Please help: how can I create concurrent indexes in migrations using GORM?
Let me try to help you with the issue. First, you should update the question with all of the source code so we can better understand what's going on and help you more accurately. Anyway, I'll try to help with what we have so far. I was able to achieve your goal with the following code:
package main

import (
    "fmt"

    "gorm.io/driver/postgres"
    "gorm.io/gorm"
)

type Post struct {
    Id    int
    Title string `gorm:"index:idx_concurrent,option:CONCURRENTLY"`
}

type GORM struct {
    db *gorm.DB
}

func main() {
    dsn := "host=localhost user=postgres password=postgres dbname=postgres port=5432 sslmode=disable"
    db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
    if err != nil {
        panic(err)
    }

    db.AutoMigrate(&Post{})

    m := db.Migrator()
    if idxFound := m.HasIndex(&Post{}, "idx_concurrent"); !idxFound {
        fmt.Println("idx missing")
        return
    }
    fmt.Println("idx already present")
}
To achieve what you need, it should be enough to add a gorm annotation next to the field where you want the index (e.g. Title). In this annotation you can specify that the index should be created concurrently, to avoid locking the table. Then, I used the gorm Migrator to check for the index's existence.
If you already have the table, you can simply add the annotation to the model struct definition and gorm will take care of it when you run the AutoMigrate method.
Thanks to this you should be able to cover all of the possible scenarios you might face.
Let me know if this solves your issue or if you need something else, thanks!
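One additional note that ties back to the suspicion about transactions in the question: Postgres does not allow CREATE INDEX CONCURRENTLY inside a transaction block, so if the migration runner wraps every migration in Begin/Commit the statement cannot succeed there. A minimal sketch of running the DDL on the plain, non-transactional handle instead (the index and table names are made up for illustration):

package migrations

import "gorm.io/gorm"

// createTitleIndex executes the DDL on the plain *gorm.DB handle rather than
// on the transaction opened by the migration runner, since Postgres rejects
// CREATE INDEX CONCURRENTLY inside a transaction block.
func createTitleIndex(db *gorm.DB) error {
    return db.Exec(
        "CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_posts_title ON posts (title)",
    ).Error
}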

What driver name do I use to connect Go sqlx to Postgres using the pgx driver?

Or alternatively, what additional imports do I need?
I'd like to start using Postgres as my main DBMS for some development I am doing, and my research suggests that pgx is the database driver of choice at this time. I have set up a connection package:
package database

import (
    "fmt"
    "net/url"

    _ "github.com/jackc/pgx/v5"
    "github.com/jmoiron/sqlx"
    log "github.com/sirupsen/logrus"
)

func PgConnect(username, password, host, app string) *sqlx.DB {
    s := fmt.Sprintf("postgres://%s:%s@%s?app+name=%s&sslmode=allow",
        url.QueryEscape(username), url.QueryEscape(password), host, app)
    db, err := sqlx.Connect("postgres", s)
    if err != nil {
        log.Panicf("failed to connect to sqlserver://%s:%s@%s err: %s", username, "xxxxxxxxxxxx", host, err)
    }
    return db
}
The URL evaluates to:
postgres://user:password@database-host:5432/database-name?app+name=app-name&sslmode=allow
and I've also tried the prefix of postgresql here because I've seen both in places on the internet.
For the driver name in the sqlx.Connect("postgres", s) call I have tried postgres, postgresql and pgx.
In all cases the call to connect fails with the error:
sql: unknown driver "postgres" (forgotten import?)
The same code (with the mssql driver and an mssql URL) works connecting to Microsoft SQL Server.
Can anyone help me get this going? I've found very little on the internet for this combination of language/driver/sqlx/postgres.
From the documentation, we can see that the driver's name is pgx:
A database/sql connection can be established through sql.Open.
db, err := sql.Open("pgx", "postgres://pgx_md5:secret#localhost:5432/pgx_test?sslmode=disable")
if err != nil {
return err
}
So you'll need to do two things:
Use the proper driver name:
db, err := sqlx.Connect("pgx", s)
Import the stdlib compatibility package:
import (
    _ "github.com/jackc/pgx/v5/stdlib" // Standard library bindings for pgx
)
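Putting both changes together, the connection helper from the question would look roughly like this. This is only a sketch that keeps the question's URL shape, except that app+name is replaced with application_name, the standard Postgres parameter name:

package database

import (
    "fmt"
    "net/url"

    _ "github.com/jackc/pgx/v5/stdlib" // registers the "pgx" database/sql driver
    "github.com/jmoiron/sqlx"
    log "github.com/sirupsen/logrus"
)

func PgConnect(username, password, host, app string) *sqlx.DB {
    // host is expected to contain host:port/database-name, as in the question.
    s := fmt.Sprintf("postgres://%s:%s@%s?application_name=%s&sslmode=allow",
        url.QueryEscape(username), url.QueryEscape(password), host, url.QueryEscape(app))
    db, err := sqlx.Connect("pgx", s)
    if err != nil {
        log.Panicf("failed to connect to %s err: %s", host, err)
    }
    return db
}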

Mongo-go-driver: context deadline exceeded

I have recently upgraded to the newer and official golang mongo driver for an app I am working on.
It all works perfectly for my local development, however when I hook it up and point it at my backend server I get a 'context deadline exceeded' error when calling the client.Ping(...) method.
The old driver code still works fine, and I also print out the connection string and can copy and paste it into the Compass app, where it works without issues.
However, for the life of me I can't work out why this new code returns a context timeout. The only different thing is that mongo is running on a non-standard port of 32680, and I am also using the mgm package. However, it just uses the official mongo driver under the hood.
Mongo version is: 4.0.12 (locally and remote)
Connection code is here:
// NewClient creates a mongo database connection
func NewClient(cfg config.Mongo) (*Client, error) {
    // create database connection string
    conStr := fmt.Sprintf("mongodb://%s:%s@%s:%s", cfg.Username, cfg.Password, cfg.Host, cfg.Port)

    // set mgm conf, i.e. the ctxTimeout value
    conf := mgm.Config{CtxTimeout: cfg.CtxTimeout}

    // set up the mgm / database connection
    err := mgm.SetDefaultConfig(&conf, cfg.Database, options.Client().ApplyURI(conStr))
    if err != nil {
        return nil, errors.Wrapf(err, "failed to connect to mongodb. cfg: %+v. conStr: %+v.", cfg, conStr)
    }

    // get access to the underlying mongodb client driver, db and mgmConfig.
    // Needed for adding additional tools like seeding/migrations/etc.
    mgmCfg, client, db, err := mgm.DefaultConfigs()
    if err != nil {
        return nil, errors.Wrap(err, "failed to return mgm.DefaultConfigs")
    }

    // NOTE: fails here!
    if err := client.Ping(mgm.Ctx(), readpref.Primary()); err != nil {
        return nil, errors.Wrapf(err, "Ping failed to mongodb. cfg: %+v. conStr: %+v. mgmCfg: %+v", cfg, conStr, mgmCfg)
    }

    return &Client{
        cfg:    cfg,
        mgmCfg: mgmCfg,
        client: client,
        db:     db,
    }, nil
}
Help! I have no idea how I can debug this any further.
Try adding your authsource to your DSN,
something like
mongodb://USER:PASSWORD@HOST:PORT/DBNAME?authSource=AUTHSOURCE
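Applied to the question's NewClient, that only means extending the connection string. A small sketch of the DSN construction with authSource made explicit (the helper and package names are made up for illustration):

package mongoconn

import "fmt"

// buildMongoURI sketches the DSN construction from the question with an
// explicit authSource. If authSource is omitted, the driver authenticates
// against the database named in the URI path, or against admin when no
// database is given, which can explain a Ping that works locally but not
// against a differently configured server.
func buildMongoURI(user, password, host, port, db, authSource string) string {
    return fmt.Sprintf("mongodb://%s:%s@%s:%s/%s?authSource=%s",
        user, password, host, port, db, authSource)
}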

Using the MongoDB Go Driver, how do I set up connection pooling?

I'm using the MongoDB Go Driver in my Go (1.11) server which runs on Google Cloud's App Engine. I'm not really sure if I still have to set up connection pooling manually or if it's already taken care of out of the box. For example, I'm not entirely sure what the context (with timeout) exactly means.
My code looks like this:
package tools

import (
    "context"
    "time"

    "valuation-app/settings"

    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

// ConnectToDB starts a new database connection and returns a reference to it
func ConnectToDB() (*mongo.Database, error) {
    settings := settings.Get().Database
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()
    options := options.Client().ApplyURI(settings.URI)
    options.SetMaxPoolSize(10)
    client, err := mongo.Connect(ctx, options)
    if err != nil {
        return nil, err
    }
    return client.Database(settings.DatabaseName), nil
}
From comments by @Alongkorn:
The mongo-go-driver takes care of connection pooling by default (100 connections by default).
You can read the source code here.
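In other words, pooling lives inside the mongo.Client, so the main thing to get right is to create the client once at startup and reuse it, rather than calling ConnectToDB per request. A minimal sketch of that pattern (the package-level variable and InitDB name are just for illustration, not part of the driver):

package tools

import (
    "context"
    "time"

    "go.mongodb.org/mongo-driver/mongo"
    "go.mongodb.org/mongo-driver/mongo/options"
)

var client *mongo.Client // created once, shared by all request handlers

// InitDB connects once at startup; the client maintains its own pool
// (up to SetMaxPoolSize connections) and is safe for concurrent use.
func InitDB(uri string) error {
    ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
    defer cancel()

    c, err := mongo.Connect(ctx, options.Client().ApplyURI(uri).SetMaxPoolSize(10))
    if err != nil {
        return err
    }
    client = c
    return nil
}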

TDD with database and Go

I'm trying to wrap my head around test-driven development with Go, and I'm having an issue testing my CRUD functions since they are written against my production database. I'm coming from Ruby on Rails, so I am used to using a test database, but Go doesn't seem to be too friendly in this regard.
So, how does one go about testing CRUD with Go?
main.go
package main

import (
    "database/sql"
)

type book struct {
    id     int     `json:"id"`
    isbn   string  `json:"isbn"`
    title  string  `json:"title"`
    author string  `json:"author"`
    price  float32 `json:"price"`
}

// type Books []*Book

// CRUD functions for Book
func (b *book) getBook(db *sql.DB) error {
    return db.QueryRow("SELECT * FROM books WHERE id=$1", b.id).Scan(&b)
}
app.go
func (a *App) Initialize(dbname string) {
    var err error
    a.DB, err = sql.Open("postgres", "postgresql://localhost:5432/bookstore?sslmode=disable")
    if err != nil {
        log.Fatal(err)
    }
}
my test
func TestGetBook(t *testing.T) {
    clearTable()
    addBook(1)
    req, _ := http.NewRequest("GET", "/book/1", nil)
    response := executeRequest(req)
    checkResponseCode(t, http.StatusOK, response.Code)
}
The problem is that this keeps on looking at the books table in my DB, not the books_test table I'd like to use for testing. How can I go about making ONLY the tests use the books_test DB?
You should create a dev/test database which is a complete copy of your production database. You never want to run tests directly against your production database, since too many unexpected issues could happen.
A workaround would be to start up your app first, which creates the connection to your database, and then run the tests. You can use IntelliJ to achieve this.
TDD in my opinion is great for developing business logic layer code since new models and business processes can have unexpected impacts on existing ones.
@Godzilla74, there are 2 solutions: enable SSL for the test DB (try to check the database settings or ask your system administrator), or have a completely different setting for tests:
func (a *App) Initialize(dbname string) {
    var err error
    pgsettings := os.Getenv("PGSETTINGS")
    if pgsettings == "" {
        // default options if not overridden
        pgsettings = "postgresql://localhost:5432/bookstore?sslmode=disable"
    }
    a.DB, err = sql.Open("postgres", pgsettings)
    if err != nil {
        log.Fatal(err)
    }
}
So you can set the environment variable to any required value and run the app, like so:
export PGSETTINGS="postgresql://localhost:5432/bookstore_test?sslmode=disable"
go run main.go
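To make only the tests use the books_test database, the same override can be wired into a TestMain so it happens before any test runs. A sketch, assuming the App type and Initialize signature from the question and a package-level a used by the test helpers:

package main

import (
    "os"
    "testing"
)

var a App

func TestMain(m *testing.M) {
    // Point the app at the test database before any test runs; the
    // individual tests then share this connection.
    os.Setenv("PGSETTINGS", "postgresql://localhost:5432/bookstore_test?sslmode=disable")
    a.Initialize("bookstore_test")

    os.Exit(m.Run())
}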