Session Persistence

bonk can save and restore browser state (cookies and localStorage) between sessions. This is useful for maintaining login sessions across automation runs.

Save State

err := ctx.SaveState("./session.dat")

Serializes cookies and localStorage to a JSON file.
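The exact schema is an internal detail of bonk, but as an illustration only — field names below simply mirror the cookie attributes and localStorage behavior described under What's Saved, and are assumptions rather than the real format — a saved file might look like:

```json
{
  "cookies": [
    {
      "name": "session",
      "value": "abc123",
      "domain": ".example.com",
      "path": "/",
      "expiry": 1735689600,
      "httpOnly": true,
      "secure": true,
      "sameSite": "Lax"
    }
  ],
  "localStorage": {
    "https://app.example.com": {
      "theme": "dark"
    }
  }
}
```

Treat the file as opaque; read and write it only through SaveState and LoadState.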

Load State

On Context Creation

ctx, err := b.NewContext(bonk.WithState("./session.dat"))

On Existing Context

err := ctx.LoadState("./session.dat")

What's Saved

State persistence saves and restores:

  • All cookies (name, value, domain, path, expiry, httpOnly, secure, sameSite)
  • localStorage per origin (from all open pages at save time)

IndexedDB

IndexedDB is not saved. It uses binary data and complex object stores that can't be easily serialized. If you need IndexedDB persistence, handle it manually via page.Evaluate.
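One manual approach is to run an export script in the page and persist the result yourself. The snippet below is a sketch of the in-page JavaScript half: it dumps every object store of a named database into a JSON string you could return through page.Evaluate (whose exact signature isn't shown here) and write to disk from Go. The database name "myDB" is a placeholder.

```javascript
// Export every record from every object store of an IndexedDB database
// into a JSON string that can be returned to the Go side and saved.
async function exportIndexedDB(dbName) {
  // Open the database, wrapping the callback API in a promise.
  const db = await new Promise((resolve, reject) => {
    const req = indexedDB.open(dbName);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });

  const dump = {};
  for (const storeName of Array.from(db.objectStoreNames)) {
    const store = db.transaction(storeName, "readonly").objectStore(storeName);
    // getAll() returns every record in the store.
    dump[storeName] = await new Promise((resolve, reject) => {
      const req = store.getAll();
      req.onsuccess = () => resolve(req.result);
      req.onerror = () => reject(req.error);
    });
  }
  return JSON.stringify(dump);
}
```

Note that JSON.stringify drops Blob and other binary values, so this only round-trips plain-data stores; restoring requires a matching import script run before your app code reads the database.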

Cookie Methods

For finer-grained control, use the cookie methods directly:

// get all cookies
cookies, err := ctx.Cookies()

// set specific cookies
err = ctx.SetCookies(
    bonk.Cookie{
        Name:     "session",
        Value:    "abc123",
        Domain:   ".example.com",
        Path:     "/",
        Secure:   true,
        HTTPOnly: true,
    },
)

// clear all cookies
err = ctx.ClearCookies()
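These methods compose: you can snapshot cookies, filter the slice, clear, and restore only a subset. The helper below is a self-contained sketch — the local Cookie struct mirrors the fields used with SetCookies above and is an assumption about bonk.Cookie's shape, not its definition:

```go
package main

import (
	"fmt"
	"strings"
)

// Cookie mirrors the fields used with ctx.SetCookies above
// (an assumption about bonk.Cookie's shape, for illustration).
type Cookie struct {
	Name, Value, Domain, Path string
	Secure, HTTPOnly          bool
}

// keepDomain returns only the cookies scoped to domain or its subdomains,
// treating a leading dot (".example.com") as equivalent to the bare domain.
func keepDomain(cookies []Cookie, domain string) []Cookie {
	var out []Cookie
	for _, c := range cookies {
		d := strings.TrimPrefix(c.Domain, ".")
		if d == domain || strings.HasSuffix(d, "."+domain) {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	all := []Cookie{
		{Name: "session", Domain: ".example.com"},
		{Name: "tracker", Domain: ".ads.net"},
	}
	kept := keepDomain(all, "example.com")
	fmt.Println(len(kept), kept[0].Name) // 1 session
}
```

With a real context the same filter would slot between the calls shown above: snapshot with ctx.Cookies(), filter, ctx.ClearCookies(), then re-set the kept cookies with ctx.SetCookies.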

Use Cases

Login Once, Reuse Session

// first run: log in and save
page.Navigate("https://app.example.com/login")
page.Fill("#email", "user@example.com")
page.Fill("#password", "secret")
page.Click("#submit")
page.WaitForURL("**/dashboard*")
ctx.SaveState("./session.dat")

// subsequent runs: restore session
ctx, _ := b.NewContext(bonk.WithState("./session.dat"))
page, _ := ctx.NewPage()
page.Navigate("https://app.example.com/dashboard") // already logged in

Resumable Crawlers

// save progress periodically
for i, url := range urls {
    page.Navigate(url)
    // ... process page ...

    if (i+1)%10 == 0 { // save every 10 pages
        ctx.SaveState("./crawler-state.dat")
    }
}