package css
Version: 0.0.0-20241121195025-46bed5c1815d
Repository: https://github.com/niklasfasching/x.git

# README

Differentiating features:

  • Extensible (allows for user-defined pseudo-classes/functions, matchers and combinators; see the pseudo-class sketch after the example below)
  • Conversion from a compiled Selector back to the selector string
  • Small (~800 LOC), simple/modular (separated into lexing, parsing & selecting) and fast (benchmarks put it head to head with cascadia; a benchmark sketch follows the example below)

#+begin_src go
package main

import (
	"log"
	"strings"

	"github.com/niklasfasching/x/css"
	"golang.org/x/net/html"
)

func main() {
	doc, _ := html.Parse(strings.NewReader(`
	  <p>
	    <span class="a">apple</span>
	    <span class="b">banana</span>
	    <span class="b">berry</span>
	    <span class="c">pear</span>
	  </p>`))

	selector := css.MustCompile("p > span.b")
	nodes := css.All(selector, doc)
	for _, n := range nodes {
		var s strings.Builder
		html.Render(&s, n)
		log.Println(s.String())
	}
	// <span class="b">banana</span>
	// <span class="b">berry</span>

	log.Printf("Converted back to string: %s", selector) // Converted back to string: p > span.b

	// easy to add your own custom pseudo-classes, pseudo-functions, matchers & combinators
	css.PseudoClasses["my-pseudo-p"] = func(n *html.Node) bool { return n.Data == "p" }
	selector = css.MustCompile(":my-pseudo-p")
	var s strings.Builder
	html.Render(&s, css.First(selector, doc))
	log.Println(s.String()) // <p>...</p>
}
#+end_src
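The example above registers a trivial pseudo-class; the sketch below shows a slightly more useful one that matches elements without element children. It relies only on the css.PseudoClasses hook, css.MustCompile and css.All shown above; the :leaf name is made up here for illustration.

#+begin_src go
package main

import (
	"log"
	"strings"

	"github.com/niklasfasching/x/css"
	"golang.org/x/net/html"
)

func main() {
	// sketch: :leaf matches elements that contain no child elements
	css.PseudoClasses["leaf"] = func(n *html.Node) bool {
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			if c.Type == html.ElementNode {
				return false // has at least one element child
			}
		}
		return true
	}
	doc, _ := html.Parse(strings.NewReader(`<p><span>a</span></p>`))
	for _, n := range css.All(css.MustCompile("span:leaf"), doc) {
		var s strings.Builder
		html.Render(&s, n)
		log.Println(s.String()) // <span>a</span>
	}
}
#+end_src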
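The cascadia comparison is easy to reproduce with go test -bench. A rough sketch follows, assuming cascadia's MustCompile/MatchAll API from github.com/andybalholm/cascadia; the selector and document are placeholders, and the numbers will of course depend on both.

#+begin_src go
// sketch: benchmark css.All against cascadia.MatchAll on the same
// selector and document
package css_test

import (
	"strings"
	"testing"

	"github.com/andybalholm/cascadia"
	"github.com/niklasfasching/x/css"
	"golang.org/x/net/html"
)

const page = `<p><span class="a">apple</span><span class="b">banana</span></p>`

func BenchmarkX(b *testing.B) {
	doc, _ := html.Parse(strings.NewReader(page))
	selector := css.MustCompile("p > span.b")
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		css.All(selector, doc)
	}
}

func BenchmarkCascadia(b *testing.B) {
	doc, _ := html.Parse(strings.NewReader(page))
	selector := cascadia.MustCompile("p > span.b")
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		selector.MatchAll(doc)
	}
}
#+end_src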

** but why?

  • For fun
  • It seemed easy enough to do
  • I've been really into writing parsers lately and this felt a bit more complicated than a lisp
  • I enjoy web scraping and wanted to learn more about the underlying concepts
  • I wanted to learn more about profiling go - and this seemed like a good playground
  • The existing css libraries cannot be extended / customized