Compare commits

1 commit
4.2.7 ... 4.0.4

Commit 4250b854c9 (2020-08-23 15:26:36 -04:00)
chore: bump version to 4.0.4
Create release containing typo fixes (#580).
646 changed files with 29942 additions and 48437 deletions

View File

@ -1,11 +0,0 @@
version: 2
updates:
- package-ecosystem: gomod
directory: "/"
schedule:
interval: daily
open-pull-requests-limit: 10
ignore:
- dependency-name: github.com/alecthomas/chroma
versions:
- 0.9.1

View File

@ -1,57 +0,0 @@
name: Go
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
# TODO: is it possible to DRY out these jobs? Aside from `runs-on`, they are
# identical.
build-linux:
runs-on: [ ubuntu-latest ]
steps:
- uses: actions/checkout@v2
- name: Set up Go
uses: actions/setup-go@v2
with:
go-version: 1.18
- name: Set up Revive (linter)
run: go get -u github.com/boyter/scc github.com/mgechev/revive
env:
GO111MODULE: off
- name: Build
run: make build
- name: Test
run: make test
build-osx:
runs-on: [ macos-latest ]
steps:
- uses: actions/checkout@v2
- name: Set up Go
uses: actions/setup-go@v2
with:
go-version: 1.18
- name: Set up Revive (linter)
run: go get -u github.com/boyter/scc github.com/mgechev/revive
env:
GO111MODULE: off
- name: Build
run: make build
- name: Test
run: make test
# TODO: windows
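The TODO above asks whether the near-identical jobs can be deduplicated. One conventional approach (a sketch only, not part of this workflow as committed) is a single job with a `matrix` over the runner OS:

```yaml
jobs:
  build:
    strategy:
      matrix:
        os: [ ubuntu-latest, macos-latest ]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v2
      - name: Set up Go
        uses: actions/setup-go@v2
        with:
          go-version: 1.18
      - name: Set up Revive (linter)
        run: go get -u github.com/boyter/scc github.com/mgechev/revive
        env:
          GO111MODULE: off
      - name: Build
        run: make build
      - name: Test
        run: make test
```

A Windows runner (`windows-latest`) could later be appended to the same matrix, which would also address the second TODO.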

View File

@ -1,36 +0,0 @@
name: CodeQL
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
schedule:
- cron: '45 23 * * 0'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
language: [ 'go' ]
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
with:
languages: ${{ matrix.language }}
- name: Autobuild
uses: github/codeql-action/autobuild@v1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1

15
.travis.yml Normal file
View File

@ -0,0 +1,15 @@
language: go
go:
- 1.14.x
os:
- linux
- osx
env:
- GO111MODULE=on
install: true
script: make ci

View File

@ -19,8 +19,7 @@ tracker][issues] to discuss with the maintainer whether it would be considered
for merging. for merging.
`cheat` is mostly mature and feature-complete, but may still have some room for `cheat` is mostly mature and feature-complete, but may still have some room for
new features. See [HACKING.md][hacking] for a quick-start guide to `cheat` new features.
development.
#### Add documentation #### #### Add documentation ####
Did you encounter features, bugs, edge-cases, use-cases, or environment Did you encounter features, bugs, edge-cases, use-cases, or environment
@ -36,13 +35,9 @@ Are you unable to do the above, but still want to contribute? You can help
`cheat` simply by telling others about it. Share it with friends and coworkers `cheat` simply by telling others about it. Share it with friends and coworkers
that might benefit from using it. that might benefit from using it.
#### Pull Requests ####
Please open all pull-requests against the `develop` branch.
[cheat]: https://github.com/cheat/cheat [cheat]: https://github.com/cheat/cheat
[cheatsheets]: https://github.com/cheat/cheatsheets [cheatsheets]: https://github.com/cheat/cheatsheets
[hacking]: HACKING.md
[issues]: https://github.com/cheat/cheat/issues [issues]: https://github.com/cheat/cheat/issues
[pr]: https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork [pr]: https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork
[wiki]: https://github.com/cheat/cheat/wiki [wiki]: https://github.com/cheat/cheat/wiki

View File

@ -1,8 +0,0 @@
# NB: this image isn't used anywhere in the build pipeline. It exists to
# conveniently facilitate ad-hoc experimentation in a sandboxed environment
# during development.
FROM golang:1.15-alpine
RUN apk add git less make
WORKDIR /app

View File

@ -1,57 +0,0 @@
Hacking
=======
The following is a quickstart guide for developing `cheat`.
## 1. Install system dependencies
Before you begin, you must install a handful of system dependencies. The
following are required, and must be available on your `PATH`:
- `git`
- `go` (>= 1.17 is recommended)
- `make`
The following dependencies are optional:
- `docker`
- `pandoc` (necessary to generate a `man` page)
## 2. Install utility applications
Run `make setup` to install `scc` and `revive`, which are used by various
`make` targets.
## 3. Development workflow
After your environment has been configured, your development workflow will
resemble the following:
1. Make changes to the `cheat` source code.
2. Run `make test` to run unit-tests.
3. Fix compiler errors and failing tests as necessary.
4. Run `make`. A `cheat` executable will be written to the `dist` directory.
5. Use the new executable by running `dist/cheat <command>`.
6. Run `make install` to install `cheat` to your `PATH`.
7. Run `make build-release` to build cross-platform binaries in `dist`.
8. Run `make clean` to clean the `dist` directory when desired.
You may run `make help` to see a list of available `make` commands.
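As a concrete illustration of that loop (commands taken from the list above; the `tar` sheet is only an example):

```sh
# edit the source, then:
make test          # run the unit tests
make               # build dist/cheat
dist/cheat tar     # try the freshly built executable
make install       # optionally install it onto your PATH
```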
### Developing with docker
It may be useful to test your changes within a pristine environment. An
Alpine-based docker container has been provided for that purpose.
If you would like to build the docker container, run:
```sh
make docker-setup
```
To shell into the container, run:
```sh
make docker-sh
```
The `cheat` source code will be mounted at `/app` within the container.
If you would like to destroy this container, you may run:
```sh
make distclean
```
[go]: https://go.dev/

View File

@ -1,77 +0,0 @@
Installing
==========
`cheat` has no runtime dependencies. As such, installing it is generally
straightforward. There are a few methods available:
### Install manually
#### Unix-like
On Unix-like systems, you may simply paste the following snippet into your terminal:
```sh
cd /tmp \
&& wget https://github.com/cheat/cheat/releases/download/4.2.7/cheat-linux-amd64.gz \
&& gunzip cheat-linux-amd64.gz \
&& chmod +x cheat-linux-amd64 \
&& sudo mv cheat-linux-amd64 /usr/local/bin/cheat
```
You may need to change the version number (`4.2.7`) and the archive
(`cheat-linux-amd64.gz`) depending on your platform.
See the [releases page][releases] for a list of supported platforms.
#### Windows
TODO: community support is requested here. Please open a PR if you'd like to
contribute installation instructions for Windows.
### Install via `go install`
If you have `go` version `>=1.17` available on your `PATH`, you can install
`cheat` via `go install`:
```sh
go install github.com/cheat/cheat/cmd/cheat@latest
```
### Install via package manager
Several community-maintained packages are also available:
Package manager | Installing
---------------- | -----------
[brew][] | `brew install cheat`
[docker][] | `alias cheat='docker run --rm bannmann/docker-cheat'`
[nix][] | `nix-env -iA nixos.cheat`
[snap][] | `snap install cheat`
<!--[pacman][] |-->
## Configuring
Three things must be done before you can use `cheat`:
1. A config file must be generated
2. [`cheatpaths`][cheatpaths] must be configured
3. [Community cheatsheets][community] must be downloaded
On first run, `cheat` will run an installer that will do all of the above
automatically. After the installer is complete, it is strongly advised that you
view the configuration file that was generated, as you may want to change some
of its default values (to enable colorization, change the paginator, etc).
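If you would rather not rely on the interactive installer, the config file can also be generated manually (a sketch assuming an XDG-style layout, mirroring the command shown in the README):

```sh
mkdir -p ~/.config/cheat && cheat --init > ~/.config/cheat/conf.yml
```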
### conf.yml ###
`cheat` is configured by a YAML file that will be auto-generated on first run.
By default, the config file is assumed to exist on an XDG-compliant
configuration path like `~/.config/cheat/conf.yml`. If you would like to store
it elsewhere, you may export a `CHEAT_CONFIG_PATH` environment variable that
specifies its path:
```sh
export CHEAT_CONFIG_PATH="~/.dotfiles/cheat/conf.yml"
```
[brew]: https://formulae.brew.sh/formula/cheat
[cheatpaths]: README.md#cheatpaths
[community]: https://github.com/cheat/cheatsheets/
[docker]: https://github.com/bannmann/docker-cheat
[nix]: https://search.nixos.org/packages?channel=unstable&show=cheat&from=0&size=50&sort=relevance&type=packages&query=cheat
[pacman]: #
[releases]: https://github.com/cheat/cheat/releases
[snap]: https://snapcraft.io/cheat

View File

@ -7,7 +7,6 @@ dist_dir := ./dist
CAT := cat CAT := cat
COLUMN := column COLUMN := column
CTAGS := ctags CTAGS := ctags
DOCKER := docker
GO := go GO := go
GREP := grep GREP := grep
GZIP := gzip --best GZIP := gzip --best
@ -21,8 +20,6 @@ SED := sed
SORT := sort SORT := sort
ZIP := zip -m ZIP := zip -m
docker_image := cheat-devel:latest
# build flags # build flags
BUILD_FLAGS := -ldflags="-s -w" -mod vendor -trimpath BUILD_FLAGS := -ldflags="-s -w" -mod vendor -trimpath
GOBIN := GOBIN :=
@ -36,18 +33,21 @@ releases := \
$(dist_dir)/cheat-linux-arm5 \ $(dist_dir)/cheat-linux-arm5 \
$(dist_dir)/cheat-linux-arm6 \ $(dist_dir)/cheat-linux-arm6 \
$(dist_dir)/cheat-linux-arm7 \ $(dist_dir)/cheat-linux-arm7 \
$(dist_dir)/cheat-linux-arm64 \
$(dist_dir)/cheat-windows-amd64.exe $(dist_dir)/cheat-windows-amd64.exe
## build: build an executable for your architecture ## build: build an executable for your architecture
.PHONY: build .PHONY: build
build: $(dist_dir) clean fmt lint vet vendor generate man build: $(dist_dir) clean vendor generate man
$(GO) build $(BUILD_FLAGS) -o $(dist_dir)/cheat $(cmd_dir) $(GO) build $(BUILD_FLAGS) -o $(dist_dir)/cheat $(cmd_dir)
## build-release: build release executables ## build-release: build release executables
.PHONY: build-release .PHONY: build-release
build-release: $(releases) build-release: $(releases)
## ci: build a "release" executable for the current architecture (used in ci)
.PHONY: ci
ci: | setup prepare build
# cheat-darwin-amd64 # cheat-darwin-amd64
$(dist_dir)/cheat-darwin-amd64: prepare $(dist_dir)/cheat-darwin-amd64: prepare
GOARCH=amd64 GOOS=darwin \ GOARCH=amd64 GOOS=darwin \
@ -77,16 +77,11 @@ $(dist_dir)/cheat-linux-arm6: prepare
$(dist_dir)/cheat-linux-arm7: prepare $(dist_dir)/cheat-linux-arm7: prepare
GOARCH=arm GOOS=linux GOARM=7 \ GOARCH=arm GOOS=linux GOARM=7 \
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz $(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
# cheat-linux-arm64
$(dist_dir)/cheat-linux-arm64: prepare
GOARCH=arm64 GOOS=linux \
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
# cheat-windows-amd64 # cheat-windows-amd64
$(dist_dir)/cheat-windows-amd64.exe: prepare $(dist_dir)/cheat-windows-amd64.exe: prepare
GOARCH=amd64 GOOS=windows \ GOARCH=amd64 GOOS=windows \
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(ZIP) $@.zip $@ -j $(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(ZIP) $@.zip $@
# ./dist # ./dist
$(dist_dir): $(dist_dir):
@ -110,7 +105,6 @@ clean: $(dist_dir)
.PHONY: distclean .PHONY: distclean
distclean: distclean:
$(RM) -f tags $(RM) -f tags
@$(DOCKER) image rm -f $(docker_image)
## setup: install revive (linter) and scc (sloc tool) ## setup: install revive (linter) and scc (sloc tool)
.PHONY: setup .PHONY: setup
@ -138,10 +132,6 @@ man:
vendor: vendor:
$(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify $(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify
## vendor-update: update vendored dependencies
vendor-update:
$(GO) get -t -u ./... && $(GO) mod vendor
## fmt: run go fmt ## fmt: run go fmt
.PHONY: fmt .PHONY: fmt
fmt: fmt:
@ -175,16 +165,6 @@ check: | vendor fmt lint vet test
.PHONY: prepare .PHONY: prepare
prepare: | $(dist_dir) clean generate vendor fmt lint vet test prepare: | $(dist_dir) clean generate vendor fmt lint vet test
## docker-setup: create a docker image for use during development
.PHONY: docker-setup
docker-setup:
$(DOCKER) build -t $(docker_image) -f Dockerfile .
## docker-sh: shell into the docker development container
.PHONY: docker-sh
docker-sh:
$(DOCKER) run -v $(shell pwd):/app -ti $(docker_image) /bin/ash
## help: display this help text ## help: display this help text
.PHONY: help .PHONY: help
help: help:

180
README.md
View File

@ -1,9 +1,8 @@
![Workflow status](https://github.com/cheat/cheat/actions/workflows/build.yml/badge.svg)
cheat cheat
===== =====
[![Build Status](https://travis-ci.com/cheat/cheat.svg?branch=master)](https://travis-ci.com/cheat/cheat)
`cheat` allows you to create and view interactive cheatsheets on the `cheat` allows you to create and view interactive cheatsheets on the
command-line. It was designed to help remind \*nix system administrators of command-line. It was designed to help remind \*nix system administrators of
options for commands that they use frequently, but not frequently enough to options for commands that they use frequently, but not frequently enough to
@ -42,6 +41,99 @@ tar -xjvf '/path/to/foo.tgz'
tar -cjvf '/path/to/foo.tgz' '/path/to/foo/' tar -cjvf '/path/to/foo.tgz' '/path/to/foo/'
``` ```
Installing
----------
`cheat` has no dependencies. To install it, download the executable from the
[releases][] page and place it on your `PATH`.
Configuring
-----------
### conf.yml ###
`cheat` is configured by a YAML file that will be auto-generated on first run.
Should you need to create a config file manually, you can do
so via:
```sh
mkdir -p ~/.config/cheat && cheat --init > ~/.config/cheat/conf.yml
```
By default, the config file is assumed to exist on an XDG-compliant
configuration path like `~/.config/cheat/conf.yml`. If you would like to store
it elsewhere, you may export a `CHEAT_CONFIG_PATH` environment variable that
specifies its path:
```sh
export CHEAT_CONFIG_PATH="~/.dotfiles/cheat/conf.yml"
```
Cheatsheets
-----------
Cheatsheets are plain-text files with no file extension, and are named
according to the command used to view them:
```sh
cheat tar # file is named "tar"
cheat foo/bar # file is named "bar", in a "foo" subdirectory
```
Cheatsheet text may optionally be preceded by a YAML frontmatter header that
assigns tags and specifies syntax:
```
---
syntax: javascript
tags: [ array, map ]
---
// To map over an array:
const squares = [1, 2, 3, 4].map(x => x * x);
```
The `cheat` executable includes no cheatsheets, but [community-sourced
cheatsheets are available][cheatsheets]. You will be asked if you would like to
install the community-sourced cheatsheets the first time you run `cheat`.
Cheatpaths
----------
Cheatsheets are stored on "cheatpaths", which are directories that contain
cheatsheets. Cheatpaths are specified in the `conf.yml` file.
It can be useful to configure `cheat` against multiple cheatpaths. A common
pattern is to store cheatsheets from multiple repositories on individual
cheatpaths:
```yaml
# conf.yml:
# ...
cheatpaths:
- name: community # a name for the cheatpath
path: ~/documents/cheat/community # the path's location on the filesystem
tags: [ community ] # these tags will be applied to all sheets on the path
readonly: true # if true, `cheat` will not create new cheatsheets here
- name: personal
path: ~/documents/cheat/personal # this is a separate directory and repository than above
tags: [ personal ]
readonly: false # new sheets may be written here
# ...
```
The `readonly` option instructs `cheat` not to edit (or create) any cheatsheets
on the path. This is useful to prevent merge-conflicts from arising on upstream
cheatsheet repositories.
If a user attempts to edit a cheatsheet on a read-only cheatpath, `cheat` will
transparently copy that sheet to a writeable directory before opening it for
editing.
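For instance, editing a sheet that lives on the read-only `community` cheatpath above might look like this (a hypothetical invocation, with flag behavior as described in the usage text):

```sh
cheat -p community -e tar   # the sheet is copied to a writeable cheatpath, then opened
```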
### Directory-scoped Cheatpaths ###
At times, it can be useful to closely associate cheatsheets with a directory on
your filesystem. `cheat` facilitates this by searching for a `.cheat` folder in
the current working directory. If found, the `.cheat` directory will
(temporarily) be added to the cheatpaths.
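A quick sketch of how that might be used (directory names are illustrative only):

```sh
cd ~/projects/example
mkdir -p .cheat        # cheatsheets placed here are scoped to this directory
cheat -l               # listings now also include sheets from ./.cheat
```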
Usage Usage
----- -----
To view a cheatsheet: To view a cheatsheet:
@ -102,77 +194,7 @@ cheat -p personal -t networking --regex -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
``` ```
Advanced Usage
Installing
----------
For installation and configuration instructions, see [INSTALLING.md][].
Cheatsheets
-----------
Cheatsheets are plain-text files with no file extension, and are named
according to the command used to view them:
```sh
cheat tar # file is named "tar"
cheat foo/bar # file is named "bar", in a "foo" subdirectory
```
Cheatsheet text may optionally be preceded by a YAML frontmatter header that Cheatsheet text may optionally be preceded by a YAML frontmatter header that
assigns tags and specifies syntax:
```
---
syntax: javascript
tags: [ array, map ]
---
// To map over an array:
const squares = [1, 2, 3, 4].map(x => x * x);
```
The `cheat` executable includes no cheatsheets, but [community-sourced
cheatsheets are available][cheatsheets]. You will be asked if you would like to
install the community-sourced cheatsheets the first time you run `cheat`.
Cheatpaths
----------
Cheatsheets are stored on "cheatpaths", which are directories that contain
cheatsheets. Cheatpaths are specified in the `conf.yml` file.
It can be useful to configure `cheat` against multiple cheatpaths. A common
pattern is to store cheatsheets from multiple repositories on individual
cheatpaths:
```yaml
# conf.yml:
# ...
cheatpaths:
- name: community # a name for the cheatpath
path: ~/documents/cheat/community # the path's location on the filesystem
tags: [ community ] # these tags will be applied to all sheets on the path
readonly: true # if true, `cheat` will not create new cheatsheets here
- name: personal
path: ~/documents/cheat/personal # this is a separate directory and repository than above
tags: [ personal ]
readonly: false # new sheets may be written here
# ...
```
The `readonly` option instructs `cheat` not to edit (or create) any cheatsheets
on the path. This is useful to prevent merge-conflicts from arising on upstream
cheatsheet repositories.
If a user attempts to edit a cheatsheet on a read-only cheatpath, `cheat` will
transparently copy that sheet to a writeable directory before opening it for
editing.
### Directory-scoped Cheatpaths ###
At times, it can be useful to closely associate cheatsheets with a directory on
your filesystem. `cheat` facilitates this by searching for a `.cheat` folder in
the current working directory. If found, the `.cheat` directory will
(temporarily) be added to the cheatpaths.
Autocompletion
-------------- --------------
Shell autocompletion is currently available for `bash`, `fish`, and `zsh`. Copy Shell autocompletion is currently available for `bash`, `fish`, and `zsh`. Copy
the relevant [completion script][completions] into the appropriate directory on the relevant [completion script][completions] into the appropriate directory on
@ -185,9 +207,7 @@ Additionally, `cheat` supports enhanced autocompletion via integration with
1. Ensure that `fzf` is available on your `$PATH` 1. Ensure that `fzf` is available on your `$PATH`
2. Set an envvar: `export CHEAT_USE_FZF=true` 2. Set an envvar: `export CHEAT_USE_FZF=true`
[INSTALLING.md]: INSTALLING.md [Releases]: https://github.com/cheat/cheat/releases
[Releases]: https://github.com/cheat/cheat/releases [cheatsheets]: https://github.com/cheat/cheatsheets
[cheatsheets]: https://github.com/cheat/cheatsheets [completions]: https://github.com/cheat/cheat/tree/master/scripts
[completions]: https://github.com/cheat/cheat/tree/master/scripts [fzf]: https://github.com/junegunn/fzf
[fzf]: https://github.com/junegunn/fzf
[go]: https://golang.org

View File

@ -1,4 +1,3 @@
//go:build ignore
// +build ignore // +build ignore
// This script embeds `docopt.txt and `conf.yml` into the binary during at // This script embeds `docopt.txt and `conf.yml` into the binary during at
@ -6,11 +5,13 @@
package main package main
import ( import (
"fmt" "fmt"
"io/ioutil" "io/ioutil"
"log" "log"
"os" "os"
"path"
"path/filepath" "path/filepath"
) )
@ -51,10 +52,10 @@ func main() {
for _, file := range files { for _, file := range files {
// delete the outfile // delete the outfile
os.Remove(filepath.Join(root, file.Out)) os.Remove(path.Join(root, file.Out))
// read the static template // read the static template
bytes, err := ioutil.ReadFile(filepath.Join(root, file.In)) bytes, err := ioutil.ReadFile(path.Join(root, file.In))
if err != nil { if err != nil {
log.Fatal(err) log.Fatal(err)
} }
@ -63,7 +64,7 @@ func main() {
data := template(file.Method, string(bytes)) data := template(file.Method, string(bytes))
// write the file to the specified outpath // write the file to the specified outpath
spath := filepath.Join(root, file.Out) spath := path.Join(root, file.Out)
err = ioutil.WriteFile(spath, []byte(data), 0644) err = ioutil.WriteFile(spath, []byte(data), 0644)
if err != nil { if err != nil {
log.Fatal(err) log.Fatal(err)

View File

@ -18,10 +18,14 @@ func cmdDirectories(opts map[string]interface{}, conf config.Config) {
// generate sorted, columnized output // generate sorted, columnized output
for _, path := range conf.Cheatpaths { for _, path := range conf.Cheatpaths {
fmt.Fprintf(w, "%s:\t%s\n", path.Name, path.Path) fmt.Fprintln(w, fmt.Sprintf(
"%s:\t%s",
path.Name,
path.Path,
))
} }
// write columnized output to stdout // write columnized output to stdout
w.Flush() w.Flush()
display.Write(out.String(), conf) display.Display(out.String(), conf)
} }

View File

@ -4,7 +4,7 @@ import (
"fmt" "fmt"
"os" "os"
"os/exec" "os/exec"
"path/filepath" "path"
"strings" "strings"
"github.com/cheat/cheat/internal/cheatpath" "github.com/cheat/cheat/internal/cheatpath"
@ -20,7 +20,7 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
@ -58,10 +58,10 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {
} }
// compute the new edit path // compute the new edit path
editpath = filepath.Join(writepath.Path, sheet.Title) editpath = path.Join(writepath.Path, sheet.Title)
// create any necessary subdirectories // create any necessary subdirectories
dirs := filepath.Dir(editpath) dirs := path.Dir(editpath)
if dirs != "." { if dirs != "." {
if err := os.MkdirAll(dirs, 0755); err != nil { if err := os.MkdirAll(dirs, 0755); err != nil {
fmt.Fprintf(os.Stderr, "failed to create directory: %s, %v\n", dirs, err) fmt.Fprintf(os.Stderr, "failed to create directory: %s, %v\n", dirs, err)
@ -87,10 +87,10 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {
} }
// compute the new edit path // compute the new edit path
editpath = filepath.Join(writepath.Path, cheatsheet) editpath = path.Join(writepath.Path, cheatsheet)
// create any necessary subdirectories // create any necessary subdirectories
dirs := filepath.Dir(editpath) dirs := path.Dir(editpath)
if dirs != "." { if dirs != "." {
if err := os.MkdirAll(dirs, 0755); err != nil { if err := os.MkdirAll(dirs, 0755); err != nil {
fmt.Fprintf(os.Stderr, "failed to create directory: %s, %v\n", dirs, err) fmt.Fprintf(os.Stderr, "failed to create directory: %s, %v\n", dirs, err)

View File

@ -3,7 +3,7 @@ package main
import ( import (
"fmt" "fmt"
"os" "os"
"path/filepath" "path"
"runtime" "runtime"
"strings" "strings"
@ -42,11 +42,11 @@ func cmdInit() {
// determine the appropriate paths for config data and (optional) community // determine the appropriate paths for config data and (optional) community
// cheatsheets based on the user's platform // cheatsheets based on the user's platform
confpath := confpaths[0] confpath := confpaths[0]
confdir := filepath.Dir(confpath) confdir := path.Dir(confpath)
// create paths for community and personal cheatsheets // create paths for community and personal cheatsheets
community := filepath.Join(confdir, "cheatsheets", "community") community := path.Join(confdir, "/cheatsheets/community")
personal := filepath.Join(confdir, "cheatsheets", "personal") personal := path.Join(confdir, "/cheatsheets/personal")
// template the above paths into the default configs // template the above paths into the default configs
configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1) configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)

View File

@ -21,11 +21,11 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
// filter cheatsheets by tag if --tag was provided // filter cheatcheats by tag if --tag was provided
if opts["--tag"] != nil { if opts["--tag"] != nil {
cheatsheets = sheets.Filter( cheatsheets = sheets.Filter(
cheatsheets, cheatsheets,
@ -37,8 +37,8 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
// sheets with local sheets), here we simply want to create a slice // sheets with local sheets), here we simply want to create a slice
// containing all sheets. // containing all sheets.
flattened := []sheet.Sheet{} flattened := []sheet.Sheet{}
for _, pathsheets := range cheatsheets { for _, pathSheets := range cheatsheets {
for _, s := range pathsheets { for _, s := range pathSheets {
flattened = append(flattened, s) flattened = append(flattened, s)
} }
} }
@ -63,7 +63,10 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
// compile the regex // compile the regex
reg, err := regexp.Compile(pattern) reg, err := regexp.Compile(pattern)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err) fmt.Fprintln(
os.Stderr,
fmt.Sprintf("failed to compile regexp: %s, %v", pattern, err),
)
os.Exit(1) os.Exit(1)
} }
@ -92,10 +95,15 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
// generate sorted, columnized output // generate sorted, columnized output
for _, sheet := range flattened { for _, sheet := range flattened {
fmt.Fprintf(w, "%s\t%s\t%s\n", sheet.Title, sheet.Path, strings.Join(sheet.Tags, ",")) fmt.Fprintln(w, fmt.Sprintf(
"%s\t%s\t%s",
sheet.Title,
sheet.Path,
strings.Join(sheet.Tags, ","),
))
} }
// write columnized output to stdout // write columnized output to stdout
w.Flush() w.Flush()
display.Write(out.String(), conf) display.Display(out.String(), conf)
} }

View File

@ -17,7 +17,7 @@ func cmdRemove(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
@ -37,19 +37,19 @@ func cmdRemove(opts map[string]interface{}, conf config.Config) {
// fail early if the requested cheatsheet does not exist // fail early if the requested cheatsheet does not exist
sheet, ok := consolidated[cheatsheet] sheet, ok := consolidated[cheatsheet]
if !ok { if !ok {
fmt.Fprintf(os.Stderr, "No cheatsheet found for '%s'.\n", cheatsheet) fmt.Fprintln(os.Stderr, fmt.Sprintf("No cheatsheet found for '%s'.\n", cheatsheet))
os.Exit(2) os.Exit(2)
} }
// fail early if the sheet is read-only // fail early if the sheet is read-only
if sheet.ReadOnly { if sheet.ReadOnly {
fmt.Fprintf(os.Stderr, "cheatsheet '%s' is read-only.\n", cheatsheet) fmt.Fprintln(os.Stderr, fmt.Sprintf("cheatsheet '%s' is read-only.", cheatsheet))
os.Exit(1) os.Exit(1)
} }
// otherwise, attempt to delete the sheet // otherwise, attempt to delete the sheet
if err := os.Remove(sheet.Path); err != nil { if err := os.Remove(sheet.Path); err != nil {
fmt.Fprintf(os.Stderr, "failed to delete sheet: %s, %v\n", sheet.Title, err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to delete sheet: %s, %v", sheet.Title, err))
os.Exit(1) os.Exit(1)
} }
} }

View File

@ -8,6 +8,7 @@ import (
"github.com/cheat/cheat/internal/config" "github.com/cheat/cheat/internal/config"
"github.com/cheat/cheat/internal/display" "github.com/cheat/cheat/internal/display"
"github.com/cheat/cheat/internal/sheet"
"github.com/cheat/cheat/internal/sheets" "github.com/cheat/cheat/internal/sheets"
) )
@ -19,7 +20,7 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
@ -31,67 +32,71 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
) )
} }
// iterate over each cheatpath // consolidate the cheatsheets found on all paths into a single map of
out := "" // `title` => `sheet` (ie, allow more local cheatsheets to override less
for _, pathcheats := range cheatsheets { // local cheatsheets)
consolidated := sheets.Consolidate(cheatsheets)
// sort the cheatsheets alphabetically, and search for matches // if <cheatsheet> was provided, search that single sheet only
for _, sheet := range sheets.Sort(pathcheats) { if opts["<cheatsheet>"] != nil {
// if <cheatsheet> was provided, constrain the search only to cheatsheet := opts["<cheatsheet>"].(string)
// matching cheatsheets
if opts["<cheatsheet>"] != nil && sheet.Title != opts["<cheatsheet>"] {
continue
}
// assume that we want to perform a case-insensitive search for <phrase> // assert that the cheatsheet exists
pattern := "(?i)" + phrase s, ok := consolidated[cheatsheet]
if !ok {
fmt.Printf("No cheatsheet found for '%s'.\n", cheatsheet)
os.Exit(2)
}
// unless --regex is provided, in which case we pass the regex unaltered consolidated = map[string]sheet.Sheet{
if opts["--regex"] == true { cheatsheet: s,
pattern = phrase
}
// compile the regex
reg, err := regexp.Compile(pattern)
if err != nil {
fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
os.Exit(1)
}
// `Search` will return text entries that match the search terms.
// We're using it here to overwrite the prior cheatsheet Text,
// filtering it to only what is relevant.
sheet.Text = sheet.Search(reg)
// if the sheet did not match the search, ignore it and move on
if sheet.Text == "" {
continue
}
// if colorization was requested, apply it here
if conf.Color(opts) {
sheet.Colorize(conf)
}
// display the cheatsheet body
out += fmt.Sprintf(
"%s %s\n%s\n",
// append the cheatsheet title
sheet.Title,
// append the cheatsheet path
display.Faint(fmt.Sprintf("(%s)", sheet.CheatPath), conf),
// indent each line of content
display.Indent(sheet.Text),
)
} }
} }
// trim superfluous newlines // sort the cheatsheets alphabetically, and search for matches
out = strings.TrimSpace(out) out := ""
for _, sheet := range sheets.Sort(consolidated) {
// assume that we want to perform a case-insensitive search for <phrase>
pattern := "(?i)" + phrase
// unless --regex is provided, in which case we pass the regex unaltered
if opts["--regex"] == true {
pattern = phrase
}
// compile the regex
reg, err := regexp.Compile(pattern)
if err != nil {
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to compile regexp: %s, %v", pattern, err))
os.Exit(1)
}
// `Search` will return text entries that match the search terms. We're
// using it here to overwrite the prior cheatsheet Text, filtering it to
// only what is relevant
sheet.Text = sheet.Search(reg)
// if the sheet did not match the search, ignore it and move on
if sheet.Text == "" {
continue
}
// if colorization was requested, apply it here
if conf.Color(opts) {
sheet.Colorize(conf)
}
// output the cheatsheet title
out += fmt.Sprintf("%s:\n", sheet.Title)
// indent each line of content with two spaces
for _, line := range strings.Split(sheet.Text, "\n") {
out += fmt.Sprintf(" %s\n", line)
}
}
// display the output // display the output
// NB: resist the temptation to call `display.Write` multiple times in the display.Display(out, conf)
// loop above. That will not play nicely with the paginator.
display.Write(out, conf)
} }

View File

@ -15,7 +15,7 @@ func cmdTags(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
@ -26,5 +26,5 @@ func cmdTags(opts map[string]interface{}, conf config.Config) {
} }
// display the output // display the output
display.Write(out, conf) display.Display(out, conf)
} }

View File

@ -18,7 +18,7 @@ func cmdView(opts map[string]interface{}, conf config.Config) {
// load the cheatsheets // load the cheatsheets
cheatsheets, err := sheets.Load(conf.Cheatpaths) cheatsheets, err := sheets.Load(conf.Cheatpaths)
if err != nil { if err != nil {
fmt.Fprintf(os.Stderr, "failed to list cheatsheets: %v\n", err) fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
os.Exit(1) os.Exit(1)
} }
@ -30,39 +30,9 @@ func cmdView(opts map[string]interface{}, conf config.Config) {
) )
} }
// if --all was passed, display cheatsheets from all cheatpaths // consolidate the cheatsheets found on all paths into a single map of
if opts["--all"].(bool) { // `title` => `sheet` (ie, allow more local cheatsheets to override less
// iterate over the cheatpaths // local cheatsheets)
out := ""
for _, cheatpath := range cheatsheets {
// if the cheatpath contains the specified cheatsheet, display it
if sheet, ok := cheatpath[cheatsheet]; ok {
// identify the matching cheatsheet
out += fmt.Sprintf("%s %s\n",
sheet.Title,
display.Faint(fmt.Sprintf("(%s)", sheet.CheatPath), conf),
)
// apply colorization if requested
if conf.Color(opts) {
sheet.Colorize(conf)
}
// display the cheatsheet
out += display.Indent(sheet.Text) + "\n"
}
}
// display and exit
display.Write(strings.TrimSuffix(out, "\n"), conf)
os.Exit(0)
}
// otherwise, consolidate the cheatsheets found on all paths into a single
// map of `title` => `sheet` (ie, allow more local cheatsheets to override
// less local cheatsheets)
consolidated := sheets.Consolidate(cheatsheets) consolidated := sheets.Consolidate(cheatsheets)
// fail early if the requested cheatsheet does not exist // fail early if the requested cheatsheet does not exist
@ -78,5 +48,5 @@ func cmdView(opts map[string]interface{}, conf config.Config) {
} }
// display the cheatsheet // display the cheatsheet
display.Write(sheet.Text, conf) display.Display(sheet.Text, conf)
} }

View File

@ -3,12 +3,11 @@ Usage:
Options: Options:
--init Write a default config file to stdout --init Write a default config file to stdout
-a --all Search among all cheatpaths
-c --colorize Colorize output -c --colorize Colorize output
-d --directories List cheatsheet directories -d --directories List cheatsheet directories
-e --edit=<cheatsheet> Edit <cheatsheet> -e --edit=<cheatsheet> Edit <cheatsheet>
-l --list List cheatsheets -l --list List cheatsheets
-p --path=<name> Return only sheets found on cheatpath <name> -p --path=<name> Return only sheets found on path <name>
-r --regex Treat search <phrase> as a regex -r --regex Treat search <phrase> as a regex
-s --search=<phrase> Search cheatsheets for <phrase> -s --search=<phrase> Search cheatsheets for <phrase>
-t --tag=<tag> Return only sheets matching <tag> -t --tag=<tag> Return only sheets matching <tag>

View File

@ -5,6 +5,7 @@ package main
import ( import (
"fmt" "fmt"
"os" "os"
"path"
"runtime" "runtime"
"strings" "strings"
@ -16,12 +17,12 @@ import (
"github.com/cheat/cheat/internal/installer" "github.com/cheat/cheat/internal/installer"
) )
const version = "4.2.7" const version = "4.0.4"
func main() { func main() {
// initialize options // initialize options
opts, err := docopt.ParseArgs(usage(), nil, version) opts, err := docopt.Parse(usage(), nil, true, version, false)
if err != nil { if err != nil {
// panic here, because this should never happen // panic here, because this should never happen
panic(fmt.Errorf("docopt failed to parse: %v", err)) panic(fmt.Errorf("docopt failed to parse: %v", err))
@ -45,9 +46,6 @@ func main() {
envvars := map[string]string{} envvars := map[string]string{}
for _, e := range os.Environ() { for _, e := range os.Environ() {
pair := strings.SplitN(e, "=", 2) pair := strings.SplitN(e, "=", 2)
if runtime.GOOS == "windows" {
pair[0] = strings.ToUpper(pair[0])
}
envvars[pair[0]] = pair[1] envvars[pair[0]] = pair[1]
} }
@ -76,16 +74,62 @@ func main() {
os.Exit(0) os.Exit(0)
} }
// choose a confpath // read the config template
confpath = confpaths[0] configs := configs()
// run the installer // determine the appropriate paths for config data and (optional) community
if err := installer.Run(configs(), confpath); err != nil { // cheatsheets based on the user's platform
fmt.Fprintf(os.Stderr, "failed to run installer: %v\n", err) confpath = confpaths[0]
confdir := path.Dir(confpath)
// create paths for community and personal cheatsheets
community := path.Join(confdir, "/cheatsheets/community")
personal := path.Join(confdir, "/cheatsheets/personal")
// template the above paths into the default configs
configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)
configs = strings.Replace(configs, "PERSONAL_PATH", personal, -1)
// prompt the user to download the community cheatsheets
yes, err = installer.Prompt(
"Would you like to download the community cheatsheets? [Y/n]",
true,
)
if err != nil {
fmt.Fprintf(os.Stderr, "failed to create config: %v\n", err)
os.Exit(1)
}
// clone the community cheatsheets if so instructed
if yes {
// clone the community cheatsheets
if err := installer.Clone(community); err != nil {
fmt.Fprintf(os.Stderr, "failed to create config: %v\n", err)
os.Exit(1)
}
// also create a directory for personal cheatsheets
if err := os.MkdirAll(personal, os.ModePerm); err != nil {
fmt.Fprintf(
os.Stderr,
"failed to create config: failed to create directory: %s: %v\n",
personal,
err)
os.Exit(1)
}
}
// the config file does not exist, so we'll try to create one
if err = config.Init(confpath, configs); err != nil {
fmt.Fprintf(
os.Stderr,
"failed to create config file: %s: %v\n",
confpath,
err,
)
os.Exit(1) os.Exit(1)
} }
// notify the user and exit
fmt.Printf("Created config file: %s\n", confpath) fmt.Printf("Created config file: %s\n", confpath)
fmt.Println("Please read this file for advanced configuration information.") fmt.Println("Please read this file for advanced configuration information.")
os.Exit(0) os.Exit(0)
@ -141,9 +185,6 @@ func main() {
case opts["<cheatsheet>"] != nil: case opts["<cheatsheet>"] != nil:
cmd = cmdView cmd = cmdView
case opts["--tag"] != nil && opts["--tag"].(string) != "":
cmd = cmdList
default: default:
fmt.Println(usage()) fmt.Println(usage())
os.Exit(0) os.Exit(0)

View File

@ -9,22 +9,22 @@ import (
func configs() string { func configs() string {
return strings.TrimSpace(`--- return strings.TrimSpace(`---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL. # The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
# editor: vim editor: vim
# Should 'cheat' always colorize output? # Should 'cheat' always colorize output?
colorize: false colorize: true
# Which 'chroma' colorscheme should be applied to the output? # Which 'chroma' colorscheme should be applied to the output?
# Options are available here: # Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles # https://github.com/alecthomas/chroma/tree/master/styles
# style: monokai style: monokai
# Which 'chroma' "formatter" should be applied? # Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m" # One of: "terminal", "terminal256", "terminal16m"
formatter: terminal formatter: terminal16m
# Through which pager should output be piped? # Through which pager should output be piped? (Unset this key for no pager.)
# pager: less -FRX # <- recommended where available pager: less -FRX
# The paths at which cheatsheets are available. Tags associated with a cheatpath # The paths at which cheatsheets are available. Tags associated with a cheatpath
# are automatically attached to all cheatsheets residing on that path. # are automatically attached to all cheatsheets residing on that path.

View File

@ -12,12 +12,11 @@ func usage() string {
Options: Options:
--init Write a default config file to stdout --init Write a default config file to stdout
-a --all Search among all cheatpaths
-c --colorize Colorize output -c --colorize Colorize output
-d --directories List cheatsheet directories -d --directories List cheatsheet directories
-e --edit=<cheatsheet> Edit <cheatsheet> -e --edit=<cheatsheet> Edit <cheatsheet>
-l --list List cheatsheets -l --list List cheatsheets
-p --path=<name> Return only sheets found on cheatpath <name> -p --path=<name> Return only sheets found on path <name>
-r --regex Treat search <phrase> as a regex -r --regex Treat search <phrase> as a regex
-s --search=<phrase> Search cheatsheets for <phrase> -s --search=<phrase> Search cheatsheets for <phrase>
-t --tag=<tag> Return only sheets matching <tag> -t --tag=<tag> Return only sheets matching <tag>

View File

@ -1,21 +1,21 @@
--- ---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL. # The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
# editor: vim editor: vim
# Should 'cheat' always colorize output? # Should 'cheat' always colorize output?
colorize: false colorize: true
# Which 'chroma' colorscheme should be applied to the output? # Which 'chroma' colorscheme should be applied to the output?
# Options are available here: # Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles # https://github.com/alecthomas/chroma/tree/master/styles
# style: monokai style: monokai
# Which 'chroma' "formatter" should be applied? # Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m" # One of: "terminal", "terminal256", "terminal16m"
formatter: terminal formatter: terminal16m
# Through which pager should output be piped? # Through which pager should output be piped? (Unset this key for no pager.)
# pager: less -FRX # <- recommended where available pager: less -FRX
# The paths at which cheatsheets are available. Tags associated with a cheatpath # The paths at which cheatsheets are available. Tags associated with a cheatpath
# are automatically attached to all cheatsheets residing on that path. # are automatically attached to all cheatsheets residing on that path.

View File

@ -1,4 +1,4 @@
.\" Automatically generated by Pandoc 2.2.1 .\" Automatically generated by Pandoc 1.17.2
.\" .\"
.TH "CHEAT" "1" "" "" "General Commands Manual" .TH "CHEAT" "1" "" "" "General Commands Manual"
.hy .hy
@ -17,62 +17,62 @@ commands that they use frequently, but not frequently enough to
remember. remember.
.SH OPTIONS .SH OPTIONS
.TP .TP
.B \[en]init .B \-\-init
Print a config file to stdout. Print a config file to stdout.
.RS .RS
.RE .RE
.TP .TP
.B \-c, \[en]colorize .B \-c, \-\-colorize
Colorize output. Colorize output.
.RS .RS
.RE .RE
.TP .TP
.B \-d, \[en]directories .B \-d, \-\-directories
List cheatsheet directories. List cheatsheet directories.
.RS .RS
.RE .RE
.TP .TP
.B \-e, \[en]edit=\f[I]CHEATSHEET\f[] .B \-e, \-\-edit=\f[I]CHEATSHEET\f[]
Open \f[I]CHEATSHEET\f[] for editing. Open \f[I]CHEATSHEET\f[] for editing.
.RS .RS
.RE .RE
.TP .TP
.B \-l, \[en]list .B \-l, \-\-list
List available cheatsheets. List available cheatsheets.
.RS .RS
.RE .RE
.TP .TP
.B \-p, \[en]path=\f[I]PATH\f[] .B \-p, \-\-path=\f[I]PATH\f[]
Filter only to sheets found on path \f[I]PATH\f[]. Filter only to sheets found on path \f[I]PATH\f[].
.RS .RS
.RE .RE
.TP .TP
.B \-r, \[en]regex .B \-r, \-\-regex
Treat search \f[I]PHRASE\f[] as a regular expression. Treat search \f[I]PHRASE\f[] as a regular expression.
.RS .RS
.RE .RE
.TP .TP
.B \-s, \[en]search=\f[I]PHRASE\f[] .B \-s, \-\-search=\f[I]PHRASE\f[]
Search cheatsheets for \f[I]PHRASE\f[]. Search cheatsheets for \f[I]PHRASE\f[].
.RS .RS
.RE .RE
.TP .TP
.B \-t, \[en]tag=\f[I]TAG\f[] .B \-t, \-\-tag=\f[I]TAG\f[]
Filter only to sheets tagged with \f[I]TAG\f[]. Filter only to sheets tagged with \f[I]TAG\f[].
.RS .RS
.RE .RE
.TP .TP
.B \-T, \[en]tags .B \-T, \-\-tags
List all tags in use. List all tags in use.
.RS .RS
.RE .RE
.TP .TP
.B \-v, \[en]version .B \-v, \-\-version
Print the version number. Print the version number.
.RS .RS
.RE .RE
.TP .TP
.B \[en]rm=\f[I]CHEATSHEET\f[] .B \-\-rm=\f[I]CHEATSHEET\f[]
Remove (deletes) \f[I]CHEATSHEET\f[]. Remove (deletes) \f[I]CHEATSHEET\f[].
.RS .RS
.RE .RE
@ -88,7 +88,7 @@ cheat \-e \f[I]foo\f[]
.RS .RS
.RE .RE
.TP .TP
.B To edit (or create) the foo/bar cheatsheet on the `work' cheatpath: .B To edit (or create) the foo/bar cheatsheet on the \[aq]work\[aq] cheatpath:
cheat \-p \f[I]work\f[] \-e \f[I]foo/bar\f[] cheat \-p \f[I]work\f[] \-e \f[I]foo/bar\f[]
.RS .RS
.RE .RE
@ -103,7 +103,7 @@ cheat \-l
.RS .RS
.RE .RE
.TP .TP
.B To list all cheatsheets whose titles match `apt': .B To list all cheatsheets whose titles match \[aq]apt\[aq]:
cheat \-l \f[I]apt\f[] cheat \-l \f[I]apt\f[]
.RS .RS
.RE .RE
@ -113,23 +113,23 @@ cheat \-T
.RS .RS
.RE .RE
.TP .TP
.B To list available cheatsheets that are tagged as `personal': .B To list available cheatsheets that are tagged as \[aq]personal\[aq]:
cheat \-l \-t \f[I]personal\f[] cheat \-l \-t \f[I]personal\f[]
.RS .RS
.RE .RE
.TP .TP
.B To search for `ssh' among all cheatsheets, and colorize matches: .B To search for \[aq]ssh\[aq] among all cheatsheets, and colorize matches:
cheat \-c \-s \f[I]ssh\f[] cheat \-c \-s \f[I]ssh\f[]
.RS .RS
.RE .RE
.TP .TP
.B To search (by regex) for cheatsheets that contain an IP address: .B To search (by regex) for cheatsheets that contain an IP address:
cheat \-c \-r \-s \f[I]`(?:[0\-9]{1,3}.){3}[0\-9]{1,3}'\f[] cheat \-c \-r \-s \f[I]\[aq](?:[0\-9]{1,3}.){3}[0\-9]{1,3}\[aq]\f[]
.RS .RS
.RE .RE
.TP .TP
.B To remove (delete) the foo/bar cheatsheet: .B To remove (delete) the foo/bar cheatsheet:
cheat \[en]rm \f[I]foo/bar\f[] cheat \-\-rm \f[I]foo/bar\f[]
.RS .RS
.RE .RE
.SH FILES .SH FILES
@ -159,15 +159,15 @@ depending upon your platform:
\f[B]cheat\f[] will search in the order specified above. \f[B]cheat\f[] will search in the order specified above.
The first \f[I]conf.yaml\f[] encountered will be respected. The first \f[I]conf.yaml\f[] encountered will be respected.
.PP .PP
If \f[B]cheat\f[] cannot locate a config file, it will ask if you'd like If \f[B]cheat\f[] cannot locate a config file, it will ask if you\[aq]d
to generate one automatically. like to generate one automatically.
Alternatively, you may also generate a config file manually by running Alternatively, you may also generate a config file manually by running
\f[B]cheat \[en]init\f[] and saving its output to the appropriate \f[B]cheat \-\-init\f[] and saving its output to the appropriate
location for your platform. location for your platform.
.SS Cheatpaths .SS Cheatpaths
.PP .PP
\f[B]cheat\f[] reads its cheatsheets from \[lq]cheatpaths\[rq], which \f[B]cheat\f[] reads its cheatsheets from "cheatpaths", which are the
are the directories in which cheatsheets are stored. directories in which cheatsheets are stored.
Cheatpaths may be configured in \f[I]conf.yaml\f[], and viewed via Cheatpaths may be configured in \f[I]conf.yaml\f[], and viewed via
\f[B]cheat \-d\f[]. \f[B]cheat \-d\f[].
.PP .PP

9
go.mod
View File

@ -3,16 +3,15 @@ module github.com/cheat/cheat
go 1.14 go 1.14
require ( require (
github.com/alecthomas/chroma v0.10.0 github.com/alecthomas/chroma v0.8.0
github.com/davecgh/go-spew v1.1.1 github.com/davecgh/go-spew v1.1.1
github.com/dlclark/regexp2 v1.7.0 // indirect
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815
github.com/kr/text v0.2.0 // indirect github.com/kr/text v0.2.0 // indirect
github.com/mattn/go-isatty v0.0.14 github.com/mattn/go-isatty v0.0.12
github.com/mitchellh/go-homedir v1.1.0 github.com/mitchellh/go-homedir v1.1.0
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e // indirect github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e // indirect
golang.org/x/sys v0.0.0-20220731174439-a90be440212d // indirect github.com/sergi/go-diff v1.1.0 // indirect
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f // indirect gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f // indirect
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0 gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0
gopkg.in/yaml.v2 v2.4.0 gopkg.in/yaml.v2 v2.3.0
) )

58
go.sum
View File

@ -1,38 +1,64 @@
github.com/alecthomas/chroma v0.10.0 h1:7XDcGkCQopCNKjZHfYrNLraA+M7e0fMiJ/Mfikbfjek= github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
github.com/alecthomas/chroma v0.10.0/go.mod h1:jtJATyUxlIORhUOFNA9NZDWGAQ8wpxQQqNSB4rjA/1s= github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
github.com/alecthomas/chroma v0.8.0 h1:HS+HE97sgcqjQGu5uVr8jIE55Mmh5UeQ7kckAhHg2pY=
github.com/alecthomas/chroma v0.8.0/go.mod h1:sko8vR34/90zvl5QdcUdvzL3J8NKjAUx9va9jPuFNoM=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
github.com/alecthomas/kong v0.2.4/go.mod h1:kQOmtJgV+Lb4aj+I2LEn40cbtawdWJ9Y8QLq+lElKxE=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E= github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c= github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.4.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc= github.com/dlclark/regexp2 v1.2.0 h1:8sAhBGEM0dRWogWqWyQeIJnxjWO6oIjl8FKqREDsGfk=
github.com/dlclark/regexp2 v1.7.0 h1:7lJfhqlPssTb1WQx4yvTHN0uElPEv52sbaECrAQxjAo= github.com/dlclark/regexp2 v1.2.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/dlclark/regexp2 v1.7.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 h1:bWDMxwH3px2JBh6AyO7hdCn/PkvCZXii8TGj7sbtEbQ= github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 h1:bWDMxwH3px2JBh6AyO7hdCn/PkvCZXii8TGj7sbtEbQ=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE= github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE=
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ= github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI= github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY= github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE= github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/mattn/go-isatty v0.0.14 h1:yVuAays6BHfxijgZPzw+3Zlu5yQgKGP2/hcQbHb7S9Y= github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
github.com/mattn/go-isatty v0.0.14/go.mod h1:7GGIvUiUoEMVVmxf/4nioHXj79iQHKdU27kJ6hsGG94= github.com/mattn/go-isatty v0.0.12 h1:wuysRhFDzyxgEmMf5xjvJ2M9dZoWAXNNr5LSBS7uHXY=
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y= github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0= github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWbfPhv4DMiApHyliiK5xCTNVSPiaAs= github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWbfPhv4DMiApHyliiK5xCTNVSPiaAs=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno= github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4= github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/sergi/go-diff v1.1.0 h1:we8PVUC3FE2uYfodKH/nBHMSetSfHDR6scGdBi+erh0=
github.com/sergi/go-diff v1.1.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME= github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.7.0 h1:nwc3DEeHmmLAfoZucVR881uASk0Mfjw8xYJ99tb5CcY= github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg= github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
golang.org/x/sys v0.0.0-20210630005230-0f9fa26af87c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
golang.org/x/sys v0.0.0-20220731174439-a90be440212d h1:Sv5ogFZatcgIMMtBSTTAgMYsicp25MXBubjXNDKwm80= github.com/stretchr/testify v1.4.0 h1:2E4SXV/wtOkTonXsotYi4li6zVWxYlZuYNCXe9XRJyk=
golang.org/x/sys v0.0.0-20220731174439-a90be440212d/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg= github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42 h1:vEOn+mP2zCOVzKckCZy6YsCtDblrpj/w7B9nxGNELpg=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 h1:opSr2sbRXk5X5/givKrrKj9HXxFpW2sdCiP8MJSKLQY=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0= gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f h1:BLraFXnmrev5lT+xlilqcH8XK9/i0At2xKjWk4p6zsU= gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f h1:BLraFXnmrev5lT+xlilqcH8XK9/i0At2xKjWk4p6zsU=
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0= gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0 h1:POO/ycCATvegFmVuPpQzZFJ+pGZeX22Ufu6fibxDVjU= gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0 h1:POO/ycCATvegFmVuPpQzZFJ+pGZeX22Ufu6fibxDVjU=
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0/go.mod h1:WDnlLJ4WF5VGsH/HVa3CI79GS0ol3YnhVnKP89i0kNg= gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0/go.mod h1:WDnlLJ4WF5VGsH/HVa3CI79GS0ol3YnhVnKP89i0kNg=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY= gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ= gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo= gopkg.in/yaml.v2 v2.3.0 h1:clyUAQHOM3G0M3f5vQj7LuJrETvjVot3Z5el9nffUtU=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= gopkg.in/yaml.v2 v2.3.0/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=

View File

@ -46,7 +46,7 @@ func TestFilterFailure(t *testing.T) {
} }
// filter the paths // filter the paths
_, err := Filter(paths, "qux") paths, err := Filter(paths, "qux")
if err == nil { if err == nil {
t.Errorf("failed to return an error on non-existent cheatpath") t.Errorf("failed to return an error on non-existent cheatpath")
} }

View File

@ -11,10 +11,12 @@ func Writeable(cheatpaths []Cheatpath) (Cheatpath, error) {
// NB: we're going backwards because we assume that the most "local" // NB: we're going backwards because we assume that the most "local"
// cheatpath will be specified last in the configs // cheatpath will be specified last in the configs
for i := len(cheatpaths) - 1; i >= 0; i-- { for i := len(cheatpaths) - 1; i >= 0; i-- {
// if the cheatpath is not read-only, it is writeable, and thus returned // if the cheatpath is not read-only, it is writeable, and thus returned
if !cheatpaths[i].ReadOnly { if cheatpaths[i].ReadOnly == false {
return cheatpaths[i], nil return cheatpaths[i], nil
} }
} }
// otherwise, return an error // otherwise, return an error
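The `Writeable` change above only swaps `== false` for the more idiomatic `!` negation; the selection logic itself is unchanged. Below is a minimal, self-contained sketch of that logic, with a hypothetical `Cheatpath` struct standing in for the project's own type:

```go
package main

import (
	"errors"
	"fmt"
)

// Cheatpath mirrors the fields used above; only ReadOnly matters here.
type Cheatpath struct {
	Name     string
	ReadOnly bool
}

// writeable returns the last non-read-only cheatpath, matching the
// backwards iteration shown in the diff (the most "local" path wins).
func writeable(cheatpaths []Cheatpath) (Cheatpath, error) {
	for i := len(cheatpaths) - 1; i >= 0; i-- {
		if !cheatpaths[i].ReadOnly {
			return cheatpaths[i], nil
		}
	}
	return Cheatpath{}, errors.New("no writeable cheatpaths found")
}

func main() {
	paths := []Cheatpath{
		{Name: "community", ReadOnly: true},
		{Name: "personal", ReadOnly: false},
	}
	p, err := writeable(paths)
	if err != nil {
		panic(err)
	}
	fmt.Println(p.Name) // personal
}
```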

View File

@ -1,22 +0,0 @@
package config
import (
"testing"
)
// TestColor asserts that colorization rules are properly respected
func TestColor(t *testing.T) {
// mock a config
conf := Config{}
opts := map[string]interface{}{"--colorize": false}
if conf.Color(opts) {
t.Errorf("failed to respect --colorize (false)")
}
opts = map[string]interface{}{"--colorize": true}
if !conf.Color(opts) {
t.Errorf("failed to respect --colorize (true)")
}
}

View File

@ -2,10 +2,9 @@ package config
import ( import (
"fmt" "fmt"
"io/ioutil"
"os" "os"
"os/exec"
"path/filepath" "path/filepath"
"runtime"
"strings" "strings"
cp "github.com/cheat/cheat/internal/cheatpath" cp "github.com/cheat/cheat/internal/cheatpath"
@ -28,7 +27,7 @@ type Config struct {
func New(opts map[string]interface{}, confPath string, resolve bool) (Config, error) { func New(opts map[string]interface{}, confPath string, resolve bool) (Config, error) {
// read the config file // read the config file
buf, err := os.ReadFile(confPath) buf, err := ioutil.ReadFile(confPath)
if err != nil { if err != nil {
return Config{}, fmt.Errorf("could not read config file: %v", err) return Config{}, fmt.Errorf("could not read config file: %v", err)
} }
@ -99,22 +98,8 @@ func New(opts map[string]interface{}, confPath string, resolve bool) (Config, er
conf.Editor = os.Getenv("VISUAL") conf.Editor = os.Getenv("VISUAL")
} else if os.Getenv("EDITOR") != "" { } else if os.Getenv("EDITOR") != "" {
conf.Editor = os.Getenv("EDITOR") conf.Editor = os.Getenv("EDITOR")
} else if runtime.GOOS == "windows" {
conf.Editor = "notepad"
} else { } else {
// try to fall back to `nano` return Config{}, fmt.Errorf("no editor set")
path, err := exec.LookPath("nano")
if err != nil {
return Config{}, fmt.Errorf("failed to locate nano: %s", err)
}
// use `nano` if we found it
if path != "" {
conf.Editor = "nano"
// otherwise, give up
} else {
return Config{}, fmt.Errorf("no editor set")
}
} }
} }
@ -125,41 +110,12 @@ func New(opts map[string]interface{}, confPath string, resolve bool) (Config, er
// if a chroma formatter was not provided, set a default // if a chroma formatter was not provided, set a default
if conf.Formatter == "" { if conf.Formatter == "" {
conf.Formatter = "terminal" conf.Formatter = "terminal16m"
} }
// attempt to fall back to `PAGER` if a pager is not specified in configs // if a pager was not provided, set a default
conf.Pager = strings.TrimSpace(conf.Pager) if strings.TrimSpace(conf.Pager) == "" {
if conf.Pager == "" { conf.Pager = ""
// look for `pager`, `less`, and `more` on the system PATH
pagerPath, _ := exec.LookPath("pager")
lessPath, _ := exec.LookPath("less")
morePath, _ := exec.LookPath("more")
// search first for a `PAGER` envvar
if os.Getenv("PAGER") != "" {
conf.Pager = os.Getenv("PAGER")
// search for `pager`
} else if pagerPath != "" {
conf.Pager = pagerPath
// search for `less`
} else if lessPath != "" {
conf.Pager = lessPath
// search for `more`
//
// XXX: this causes issues on some Linux systems. See:
// https://github.com/cheat/cheat/issues/681#issuecomment-1201842334
//
// By checking for `more` last, we're hoping to at least mitigate
// the frequency of this occurrence, because `pager` and `less` are
// likely to be available on most systems on which a user is likely
// to have installed `cheat`.
} else if morePath != "" {
conf.Pager = morePath
}
} }
return conf, nil return conf, nil
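The new pager logic above falls back from the configured value to `$PAGER`, and then to whichever of `pager`, `less`, or `more` is found on the `PATH`, checking `more` last because of the Linux issue referenced in the comment. A rough sketch of that fallback order using only the standard library (not the project's actual function):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// resolvePager mirrors the fallback order in the diff: an explicit config
// value wins, then $PAGER, then the first of pager/less/more on the PATH.
func resolvePager(configured string) string {
	if p := strings.TrimSpace(configured); p != "" {
		return p
	}
	if p := os.Getenv("PAGER"); p != "" {
		return p
	}
	for _, candidate := range []string{"pager", "less", "more"} {
		if path, err := exec.LookPath(candidate); err == nil {
			return path
		}
	}
	return "" // no pager found: caller prints straight to stdout
}

func main() {
	fmt.Println("pager:", resolvePager(""))
}
```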

View File

@ -39,17 +39,17 @@ func TestConfigSuccessful(t *testing.T) {
// assert that the cheatpaths are correct // assert that the cheatpaths are correct
want := []cheatpath.Cheatpath{ want := []cheatpath.Cheatpath{
cheatpath.Cheatpath{ cheatpath.Cheatpath{
Path: filepath.Join(home, ".dotfiles", "cheat", "community"), Path: filepath.Join(home, ".dotfiles/cheat/community"),
ReadOnly: true, ReadOnly: true,
Tags: []string{"community"}, Tags: []string{"community"},
}, },
cheatpath.Cheatpath{ cheatpath.Cheatpath{
Path: filepath.Join(home, ".dotfiles", "cheat", "work"), Path: filepath.Join(home, ".dotfiles/cheat/work"),
ReadOnly: false, ReadOnly: false,
Tags: []string{"work"}, Tags: []string{"work"},
}, },
cheatpath.Cheatpath{ cheatpath.Cheatpath{
Path: filepath.Join(home, ".dotfiles", "cheat", "personal"), Path: filepath.Join(home, ".dotfiles/cheat/personal"),
ReadOnly: false, ReadOnly: false,
Tags: []string{"personal"}, Tags: []string{"personal"},
}, },
@ -85,8 +85,8 @@ func TestEmptyEditor(t *testing.T) {
// initialize a config // initialize a config
conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false) conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
if err != nil { if err == nil {
t.Errorf("failed to initialize test: %v", err) t.Errorf("failed to return an error on empty editor")
} }
// set editor, and assert that it is respected // set editor, and assert that it is respected

View File

@ -2,6 +2,7 @@ package config
import ( import (
"fmt" "fmt"
"io/ioutil"
"os" "os"
"path/filepath" "path/filepath"
) )
@ -15,7 +16,7 @@ func Init(confpath string, configs string) error {
} }
// write the config file // write the config file
if err := os.WriteFile(confpath, []byte(configs), 0644); err != nil { if err := ioutil.WriteFile(confpath, []byte(configs), 0644); err != nil {
return fmt.Errorf("failed to create file: %v", err) return fmt.Errorf("failed to create file: %v", err)
} }

View File

@ -1,37 +0,0 @@
package config
import (
"os"
"testing"
)
// TestInit asserts that configs are properly initialized
func TestInit(t *testing.T) {
// initialize a temporary config file
confFile, err := os.CreateTemp("", "cheat-test")
if err != nil {
t.Errorf("failed to create temp file: %v", err)
}
// clean up the temp file
defer os.Remove(confFile.Name())
// initialize the config file
conf := "mock config data"
if err = Init(confFile.Name(), conf); err != nil {
t.Errorf("failed to init config file: %v", err)
}
// read back the config file contents
bytes, err := os.ReadFile(confFile.Name())
if err != nil {
t.Errorf("failed to read config file: %v", err)
}
// assert that the contents were written correctly
got := string(bytes)
if got != conf {
t.Errorf("failed to write configs: want: %s, got: %s", conf, got)
}
}

View File

@ -1,52 +0,0 @@
package config
import (
"os"
"testing"
)
// TestPathConfigNotExists asserts that `Path` identifies non-existent config
// files
func TestPathConfigNotExists(t *testing.T) {
// package (invalid) cheatpaths
paths := []string{"/cheat-test-conf-does-not-exist"}
// assert
if _, err := Path(paths); err == nil {
t.Errorf("failed to identify non-existent config file")
}
}
// TestPathConfigExists asserts that `Path` identifies existent config files
func TestPathConfigExists(t *testing.T) {
// initialize a temporary config file
confFile, err := os.CreateTemp("", "cheat-test")
if err != nil {
t.Errorf("failed to create temp file: %v", err)
}
// clean up the temp file
defer os.Remove(confFile.Name())
// package cheatpaths
paths := []string{
"/cheat-test-conf-does-not-exist",
confFile.Name(),
}
// assert
got, err := Path(paths)
if err != nil {
t.Errorf("failed to identify config file: %v", err)
}
if got != confFile.Name() {
t.Errorf(
"failed to return config path: want: %s, got: %s",
confFile.Name(),
got,
)
}
}

View File

@ -2,7 +2,7 @@ package config
import ( import (
"fmt" "fmt"
"path/filepath" "path"
"github.com/mitchellh/go-homedir" "github.com/mitchellh/go-homedir"
) )
@ -28,25 +28,25 @@ func Paths(
} }
switch sys { switch sys {
case "android", "darwin", "linux", "freebsd": case "darwin", "linux", "freebsd":
paths := []string{} paths := []string{}
// don't include the `XDG_CONFIG_HOME` path if that envvar is not set // don't include the `XDG_CONFIG_HOME` path if that envvar is not set
if xdgpath, ok := envvars["XDG_CONFIG_HOME"]; ok { if xdgpath, ok := envvars["XDG_CONFIG_HOME"]; ok {
paths = append(paths, filepath.Join(xdgpath, "cheat", "conf.yml")) paths = append(paths, path.Join(xdgpath, "/cheat/conf.yml"))
} }
paths = append(paths, []string{ paths = append(paths, []string{
filepath.Join(home, ".config", "cheat", "conf.yml"), path.Join(home, ".config/cheat/conf.yml"),
filepath.Join(home, ".cheat", "conf.yml"), path.Join(home, ".cheat/conf.yml"),
"/etc/cheat/conf.yml", "/etc/cheat/conf.yml",
}...) }...)
return paths, nil return paths, nil
case "windows": case "windows":
return []string{ return []string{
filepath.Join(envvars["APPDATA"], "cheat", "conf.yml"), path.Join(envvars["APPDATA"], "/cheat/conf.yml"),
filepath.Join(envvars["PROGRAMDATA"], "cheat", "conf.yml"), path.Join(envvars["PROGRAMDATA"], "/cheat/conf.yml"),
}, nil }, nil
default: default:
return []string{}, fmt.Errorf("unsupported os: %s", sys) return []string{}, fmt.Errorf("unsupported os: %s", sys)
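The `Paths` rewrite above replaces `path.Join` with `filepath.Join` and joins each segment individually, so the same code also produces correct separators on Windows. A small illustration of the Unix-like search order, using a hypothetical helper name:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// candidateConfigPaths sketches the search order shown above for
// Unix-like systems, using filepath.Join as the new version does.
func candidateConfigPaths(home string, envvars map[string]string) []string {
	paths := []string{}
	// only include the XDG path when the envvar is actually set
	if xdg, ok := envvars["XDG_CONFIG_HOME"]; ok {
		paths = append(paths, filepath.Join(xdg, "cheat", "conf.yml"))
	}
	return append(paths,
		filepath.Join(home, ".config", "cheat", "conf.yml"),
		filepath.Join(home, ".cheat", "conf.yml"),
		"/etc/cheat/conf.yml",
	)
}

func main() {
	env := map[string]string{"XDG_CONFIG_HOME": "/home/alice/.config"}
	for _, p := range candidateConfigPaths("/home/alice", env) {
		fmt.Println(p)
	}
}
```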

View File

@ -21,7 +21,6 @@ func TestValidatePathsNix(t *testing.T) {
// specify the platforms to test // specify the platforms to test
oses := []string{ oses := []string{
"android",
"darwin", "darwin",
"freebsd", "freebsd",
"linux", "linux",

View File

@ -9,9 +9,9 @@ import (
"github.com/cheat/cheat/internal/config" "github.com/cheat/cheat/internal/config"
) )
// Write writes output either directly to stdout, or through a pager, // Display writes output either directly to stdout, or through a pager,
// depending upon configuration. // depending upon configuration.
func Write(out string, conf config.Config) { func Display(out string, conf config.Config) {
// if no pager was configured, print the output to stdout and exit // if no pager was configured, print the output to stdout and exit
if conf.Pager == "" { if conf.Pager == "" {
fmt.Print(out) fmt.Print(out)
@ -23,14 +23,15 @@ func Write(out string, conf config.Config) {
pager := parts[0] pager := parts[0]
args := parts[1:] args := parts[1:]
// configure the pager // run the pager
cmd := exec.Command(pager, args...) cmd := exec.Command(pager, args...)
cmd.Stdin = strings.NewReader(out) cmd.Stdin = strings.NewReader(out)
cmd.Stdout = os.Stdout cmd.Stdout = os.Stdout
// run the pager and handle errors // handle errors
if err := cmd.Run(); err != nil { err := cmd.Run()
fmt.Fprintf(os.Stderr, "failed to write to pager: %v\n", err) if err != nil {
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to write to pager: %v", err))
os.Exit(1) os.Exit(1)
} }
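`Write` above pipes the rendered output through the configured pager by attaching a `strings.Reader` to the command's stdin. A compact sketch of that pattern; splitting the pager string on whitespace is a simplification of whatever parsing the real code does:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// page writes out through the given pager command, or straight to stdout
// when no pager is configured (a sketch, not the display package itself).
func page(out, pager string) error {
	pager = strings.TrimSpace(pager)
	if pager == "" {
		fmt.Print(out)
		return nil
	}
	parts := strings.Fields(pager) // e.g. "less -R" -> ["less", "-R"]
	cmd := exec.Command(parts[0], parts[1:]...)
	cmd.Stdin = strings.NewReader(out)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}

func main() {
	if err := page("hello from the pager\n", os.Getenv("PAGER")); err != nil {
		fmt.Fprintf(os.Stderr, "failed to write to pager: %v\n", err)
		os.Exit(1)
	}
}
```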
} }

View File

@ -1,18 +0,0 @@
package display
import (
"fmt"
"github.com/cheat/cheat/internal/config"
)
// Faint returns a faint string
func Faint(str string, conf config.Config) string {
// make `str` faint only if colorization has been requested
if conf.Colorize {
return fmt.Sprintf("\033[2m%s\033[0m", str)
}
// otherwise, return the string unmodified
return str
}

View File

@ -1,27 +0,0 @@
package display
import (
"testing"
"github.com/cheat/cheat/internal/config"
)
// TestFaint asserts that Faint applies faint formatting
func TestFaint(t *testing.T) {
// case: apply colorization
conf := config.Config{Colorize: true}
want := "\033[2mfoo\033[0m"
got := Faint("foo", conf)
if want != got {
t.Errorf("failed to faint: want: %s, got: %s", want, got)
}
// case: do not apply colorization
conf.Colorize = false
want = "foo"
got = Faint("foo", conf)
if want != got {
t.Errorf("failed to faint: want: %s, got: %s", want, got)
}
}

View File

@ -1,21 +0,0 @@
package display
import (
"fmt"
"strings"
)
// Indent prepends each line of a string with a tab
func Indent(str string) string {
// trim superfluous whitespace
str = strings.TrimSpace(str)
// prepend each line with a tab character
out := ""
for _, line := range strings.Split(str, "\n") {
out += fmt.Sprintf("\t%s\n", line)
}
return out
}

View File

@ -1,12 +0,0 @@
package display
import "testing"
// TestIndent asserts that Indent prepends a tab to each line
func TestIndent(t *testing.T) {
got := Indent("foo\nbar\nbaz")
want := "\tfoo\n\tbar\n\tbaz\n"
if got != want {
t.Errorf("failed to indent: want: %s, got: %s", want, got)
}
}

View File

@ -8,8 +8,8 @@ import (
const cloneURL = "https://github.com/cheat/cheatsheets.git" const cloneURL = "https://github.com/cheat/cheatsheets.git"
// clone clones the community cheatsheets // Clone clones the community cheatsheets
func clone(path string) error { func Clone(path string) error {
// perform the clone in a shell // perform the clone in a shell
cmd := exec.Command("git", "clone", cloneURL, path) cmd := exec.Command("git", "clone", cloneURL, path)

View File

@ -14,7 +14,7 @@ func Prompt(prompt string, def bool) (bool, error) {
reader := bufio.NewReader(os.Stdin) reader := bufio.NewReader(os.Stdin)
// display the prompt // display the prompt
fmt.Printf("%s: ", prompt) fmt.Print(fmt.Sprintf("%s: ", prompt))
// read the answer // read the answer
ans, err := reader.ReadString('\n') ans, err := reader.ReadString('\n')
@ -23,7 +23,7 @@ func Prompt(prompt string, def bool) (bool, error) {
} }
// normalize the answer // normalize the answer
ans = strings.ToLower(strings.TrimSpace(ans)) ans = strings.ToLower(strings.TrimRight(ans, "\n"))
// return the appropriate response // return the appropriate response
switch ans { switch ans {
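`Prompt` above now trims all surrounding whitespace before lowercasing the answer, so `\r\n` line endings are handled too. The sketch below reproduces that normalization; the accepted answers in the `switch` are an assumption, since the hunk cuts off before them:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// prompt asks a yes/no question and returns def on an unrecognized or
// empty answer, using the TrimSpace + ToLower normalization shown above.
func prompt(question string, def bool) (bool, error) {
	reader := bufio.NewReader(os.Stdin)
	fmt.Printf("%s: ", question)
	ans, err := reader.ReadString('\n')
	if err != nil {
		return false, err
	}
	switch strings.ToLower(strings.TrimSpace(ans)) {
	case "y", "yes":
		return true, nil
	case "n", "no":
		return false, nil
	default:
		return def, nil
	}
}

func main() {
	yes, err := prompt("Would you like to download the community cheatsheets? [Y/n]", true)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("answer:", yes)
}
```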

View File

@ -1,57 +0,0 @@
package installer
import (
"fmt"
"os"
"path/filepath"
"strings"
"github.com/cheat/cheat/internal/config"
)
// Run runs the installer
func Run(configs string, confpath string) error {
// determine the appropriate paths for config data and (optional) community
// cheatsheets based on the user's platform
confdir := filepath.Dir(confpath)
// create paths for community and personal cheatsheets
community := filepath.Join(confdir, "cheatsheets", "community")
personal := filepath.Join(confdir, "cheatsheets", "personal")
// template the above paths into the default configs
configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)
configs = strings.Replace(configs, "PERSONAL_PATH", personal, -1)
// prompt the user to download the community cheatsheets
yes, err := Prompt(
"Would you like to download the community cheatsheets? [Y/n]",
true,
)
if err != nil {
return fmt.Errorf("failed to prompt: %v", err)
}
// clone the community cheatsheets if so instructed
if yes {
// clone the community cheatsheets
fmt.Printf("Cloning community cheatsheets to %s.\n", community)
if err := clone(community); err != nil {
return fmt.Errorf("failed to clone cheatsheets: %v", err)
}
// also create a directory for personal cheatsheets
fmt.Printf("Cloning personal cheatsheets to %s.\n", personal)
if err := os.MkdirAll(personal, os.ModePerm); err != nil {
return fmt.Errorf("failed to create directory: %v", err)
}
}
// the config file does not exist, so we'll try to create one
if err = config.Init(confpath, configs); err != nil {
return fmt.Errorf("failed to create config file: %v", err)
}
return nil
}

View File

@ -13,7 +13,7 @@ func Path(filename string) string {
// determine the path of this file during runtime // determine the path of this file during runtime
_, thisfile, _, _ := runtime.Caller(0) _, thisfile, _, _ := runtime.Caller(0)
// compute the mock path // compute the config path
file, err := filepath.Abs( file, err := filepath.Abs(
path.Join( path.Join(
filepath.Dir(thisfile), filepath.Dir(thisfile),
@ -22,7 +22,7 @@ func Path(filename string) string {
), ),
) )
if err != nil { if err != nil {
panic(fmt.Errorf("failed to resolve mock path: %v", err)) panic(fmt.Errorf("failed to resolve config path: %v", err))
} }
return file return file

View File

@ -1,34 +0,0 @@
package sheet
import (
"testing"
"github.com/cheat/cheat/internal/config"
)
// TestColorize asserts that syntax-highlighting is correctly applied
func TestColorize(t *testing.T) {
// mock configs
conf := config.Config{
Formatter: "terminal16m",
Style: "solarized-dark",
}
// mock a sheet
s := Sheet{
Text: "echo 'foo'",
}
// colorize the sheet text
s.Colorize(conf)
// initialize expectations
want := "echo"
want += " 'foo'"
// assert
if s.Text != want {
t.Errorf("failed to colorize sheet: want: %s, got: %s", want, s.Text)
}
}

View File

@ -4,7 +4,7 @@ import (
"fmt" "fmt"
"io" "io"
"os" "os"
"path/filepath" "path"
) )
// Copy copies a cheatsheet to a new location // Copy copies a cheatsheet to a new location
@ -22,7 +22,7 @@ func (s *Sheet) Copy(dest string) error {
defer infile.Close() defer infile.Close()
// create any necessary subdirectories // create any necessary subdirectories
dirs := filepath.Dir(dest) dirs := path.Dir(dest)
if dirs != "." { if dirs != "." {
if err := os.MkdirAll(dirs, 0755); err != nil { if err := os.MkdirAll(dirs, 0755); err != nil {
return fmt.Errorf("failed to create directory: %s, %v", dirs, err) return fmt.Errorf("failed to create directory: %s, %v", dirs, err)

View File

@ -1,6 +1,7 @@
package sheet package sheet
import ( import (
"io/ioutil"
"os" "os"
"path" "path"
"testing" "testing"
@ -12,7 +13,7 @@ func TestCopyFlat(t *testing.T) {
// mock a cheatsheet file // mock a cheatsheet file
text := "this is the cheatsheet text" text := "this is the cheatsheet text"
src, err := os.CreateTemp("", "foo-src") src, err := ioutil.TempFile("", "foo-src")
if err != nil { if err != nil {
t.Errorf("failed to mock cheatsheet: %v", err) t.Errorf("failed to mock cheatsheet: %v", err)
} }
@ -24,7 +25,7 @@ func TestCopyFlat(t *testing.T) {
} }
// mock a cheatsheet struct // mock a cheatsheet struct
sheet, err := New("foo", "community", src.Name(), []string{}, false) sheet, err := New("foo", src.Name(), []string{}, false)
if err != nil { if err != nil {
t.Errorf("failed to init cheatsheet: %v", err) t.Errorf("failed to init cheatsheet: %v", err)
} }
@ -40,7 +41,7 @@ func TestCopyFlat(t *testing.T) {
} }
// assert that the destination file contains the correct text // assert that the destination file contains the correct text
got, err := os.ReadFile(outpath) got, err := ioutil.ReadFile(outpath)
if err != nil { if err != nil {
t.Errorf("failed to read destination file: %v", err) t.Errorf("failed to read destination file: %v", err)
} }
@ -59,7 +60,7 @@ func TestCopyDeep(t *testing.T) {
// mock a cheatsheet file // mock a cheatsheet file
text := "this is the cheatsheet text" text := "this is the cheatsheet text"
src, err := os.CreateTemp("", "foo-src") src, err := ioutil.TempFile("", "foo-src")
if err != nil { if err != nil {
t.Errorf("failed to mock cheatsheet: %v", err) t.Errorf("failed to mock cheatsheet: %v", err)
} }
@ -71,13 +72,7 @@ func TestCopyDeep(t *testing.T) {
} }
// mock a cheatsheet struct // mock a cheatsheet struct
sheet, err := New( sheet, err := New("/cheat-tests/alpha/bravo/foo", src.Name(), []string{}, false)
"/cheat-tests/alpha/bravo/foo",
"community",
src.Name(),
[]string{},
false,
)
if err != nil { if err != nil {
t.Errorf("failed to init cheatsheet: %v", err) t.Errorf("failed to init cheatsheet: %v", err)
} }
@ -93,7 +88,7 @@ func TestCopyDeep(t *testing.T) {
} }
// assert that the destination file contains the correct text // assert that the destination file contains the correct text
got, err := os.ReadFile(outpath) got, err := ioutil.ReadFile(outpath)
if err != nil { if err != nil {
t.Errorf("failed to read destination file: %v", err) t.Errorf("failed to read destination file: %v", err)
} }

View File

@ -2,7 +2,7 @@ package sheet
import ( import (
"fmt" "fmt"
"os" "io/ioutil"
"sort" "sort"
"github.com/cheat/cheat/internal/frontmatter" "github.com/cheat/cheat/internal/frontmatter"
@ -10,26 +10,24 @@ import (
// Sheet encapsulates sheet information // Sheet encapsulates sheet information
type Sheet struct { type Sheet struct {
Title string Title string
CheatPath string Path string
Path string Text string
Text string Tags []string
Tags []string Syntax string
Syntax string ReadOnly bool
ReadOnly bool
} }
// New initializes a new Sheet // New initializes a new Sheet
func New( func New(
title string, title string,
cheatpath string,
path string, path string,
tags []string, tags []string,
readOnly bool, readOnly bool,
) (Sheet, error) { ) (Sheet, error) {
// read the cheatsheet file // read the cheatsheet file
markdown, err := os.ReadFile(path) markdown, err := ioutil.ReadFile(path)
if err != nil { if err != nil {
return Sheet{}, fmt.Errorf("failed to read file: %s, %v", path, err) return Sheet{}, fmt.Errorf("failed to read file: %s, %v", path, err)
} }
@ -48,12 +46,11 @@ func New(
// initialize and return a sheet // initialize and return a sheet
return Sheet{ return Sheet{
Title: title, Title: title,
CheatPath: cheatpath, Path: path,
Path: path, Text: text + "\n",
Text: text + "\n", Tags: tags,
Tags: tags, Syntax: fm.Syntax,
Syntax: fm.Syntax, ReadOnly: readOnly,
ReadOnly: readOnly,
}, nil }, nil
} }
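The `Sheet` struct gains a `CheatPath` field and `New` gains a matching parameter. The sketch below mirrors that new shape but deliberately skips the frontmatter parsing, so `Syntax` is never populated; it is illustrative only, not the project's constructor:

```go
package main

import (
	"fmt"
	"os"
)

// Sheet mirrors the fields shown above; frontmatter handling is elided.
type Sheet struct {
	Title     string
	CheatPath string
	Path      string
	Text      string
	Tags      []string
	Syntax    string
	ReadOnly  bool
}

// newSheet sketches the constructor above: read the file, then populate
// the struct with the caller-supplied metadata.
func newSheet(title, cheatpath, path string, tags []string, readOnly bool) (Sheet, error) {
	markdown, err := os.ReadFile(path)
	if err != nil {
		return Sheet{}, fmt.Errorf("failed to read file: %s, %v", path, err)
	}
	return Sheet{
		Title:     title,
		CheatPath: cheatpath,
		Path:      path,
		Text:      string(markdown) + "\n",
		Tags:      tags,
		ReadOnly:  readOnly,
	}, nil
}

func main() {
	s, err := newSheet("tar", "community", "/tmp/cheatsheets/tar", []string{"community"}, true)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(s.Title, len(s.Text))
}
```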

View File

@ -13,7 +13,6 @@ func TestSheetSuccess(t *testing.T) {
// initialize a sheet // initialize a sheet
sheet, err := New( sheet, err := New(
"foo", "foo",
"community",
mock.Path("sheet/foo"), mock.Path("sheet/foo"),
[]string{"alpha", "bravo"}, []string{"alpha", "bravo"},
false, false,
@ -62,7 +61,6 @@ func TestSheetFailure(t *testing.T) {
// initialize a sheet // initialize a sheet
_, err := New( _, err := New(
"foo", "foo",
"community",
mock.Path("/does-not-exist"), mock.Path("/does-not-exist"),
[]string{"alpha", "bravo"}, []string{"alpha", "bravo"},
false, false,
@ -71,20 +69,3 @@ func TestSheetFailure(t *testing.T) {
t.Errorf("failed to return an error on unreadable sheet") t.Errorf("failed to return an error on unreadable sheet")
} }
} }
// TestSheetFrontMatterFailure asserts that an error is returned if the sheet's
// frontmatter cannot be parsed.
func TestSheetFrontMatterFailure(t *testing.T) {
// initialize a sheet
_, err := New(
"foo",
"community",
mock.Path("sheet/bad-fm"),
[]string{"alpha", "bravo"},
false,
)
if err == nil {
t.Errorf("failed to return an error on malformed front-matter")
}
}

View File

@ -2,7 +2,6 @@ package sheets
import ( import (
"fmt" "fmt"
"io/fs"
"os" "os"
"path/filepath" "path/filepath"
"strings" "strings"
@ -56,21 +55,11 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
// contained within hidden directories in the middle of a path, though // contained within hidden directories in the middle of a path, though
// that should not realistically occur. // that should not realistically occur.
if strings.HasPrefix(title, ".") || strings.HasPrefix(info.Name(), ".") { if strings.HasPrefix(title, ".") || strings.HasPrefix(info.Name(), ".") {
// Do not walk hidden directories. This is important, return nil
// because it's common for cheatsheets to be stored in
// version-control, and a `.git` directory can easily
// contain thousands of files.
return fs.SkipDir
} }
// parse the cheatsheet file into a `sheet` struct // parse the cheatsheet file into a `sheet` struct
s, err := sheet.New( s, err := sheet.New(title, path, cheatpath.Tags, cheatpath.ReadOnly)
title,
cheatpath.Name,
path,
cheatpath.Tags,
cheatpath.ReadOnly,
)
if err != nil { if err != nil {
return fmt.Errorf( return fmt.Errorf(
"failed to load sheet: %s, path: %s, err: %v", "failed to load sheet: %s, path: %s, err: %v",

View File

@ -1,62 +1,3 @@
package sheets package sheets
import ( // TODO
"path"
"testing"
"github.com/cheat/cheat/internal/cheatpath"
"github.com/cheat/cheat/internal/mock"
)
// TestLoad asserts that sheets on valid cheatpaths can be loaded successfully
func TestLoad(t *testing.T) {
// mock cheatpaths
cheatpaths := []cheatpath.Cheatpath{
{
Name: "community",
Path: path.Join(mock.Path("cheatsheets"), "community"),
ReadOnly: true,
},
{
Name: "personal",
Path: path.Join(mock.Path("cheatsheets"), "personal"),
ReadOnly: false,
},
}
// load cheatsheets
sheets, err := Load(cheatpaths)
if err != nil {
t.Errorf("failed to load cheatsheets: %v", err)
}
// assert that the correct number of sheets loaded
// (sheet load details are tested in `sheet_test.go`)
want := 4
if len(sheets) != want {
t.Errorf(
"failed to load correct number of cheatsheets: want: %d, got: %d",
want,
len(sheets),
)
}
}
// TestLoadBadPath asserts that an error is returned if a cheatpath is invalid
func TestLoadBadPath(t *testing.T) {
// mock a bad cheatpath
cheatpaths := []cheatpath.Cheatpath{
{
Name: "badpath",
Path: "/cheat/test/path/does/not/exist",
ReadOnly: true,
},
}
// attempt to load the cheatpath
if _, err := Load(cheatpaths); err == nil {
t.Errorf("failed to reject invalid cheatpath")
}
}

View File

@ -1,4 +0,0 @@
---
tags: [ community ]
---
This is the bar cheatsheet.

View File

@ -1,4 +0,0 @@
---
tags: [ community ]
---
This is the foo cheatsheet.

View File

@ -1,4 +0,0 @@
---
tags: [ personal ]
---
This is the bat cheatsheet.

View File

@ -1,4 +0,0 @@
---
tags: [ personal ]
---
This is the baz cheatsheet.

View File

@ -1,4 +0,0 @@
---
syntax: sh
This is malformed frontmatter.

View File

@ -40,7 +40,8 @@ _cheat() {
'(-t --tag)'{-t,--tag}'[Return only sheets matching <tag>]: :->taglist' \ '(-t --tag)'{-t,--tag}'[Return only sheets matching <tag>]: :->taglist' \
'(-T --tags)'{-T,--tags}'[List all tags in use]: :->none' \ '(-T --tags)'{-T,--tags}'[List all tags in use]: :->none' \
'(-v --version)'{-v,--version}'[Print the version number]: :->none' \ '(-v --version)'{-v,--version}'[Print the version number]: :->none' \
'(--rm)--rm[Remove (delete) <sheet>]: :->personal' '(--rm)--rm[Remove (delete) <sheet>]: :->personal' \
'(-)*: :->full'
case $state in case $state in
(none) (none)
@ -62,4 +63,4 @@ _cheat() {
esac esac
} }
compdef _cheat cheat _cheat

View File

@ -1,46 +0,0 @@
#!/bin/sh -e
pull() {
for d in `cheat -d | awk '{print $2}'`;
do
echo "Update $d"
cd "$d"
[ -d ".git" ] && git pull || :
done
echo
echo "Finished update"
}
push() {
for d in `cheat -d | grep -v "community" | awk '{print $2}'`;
do
cd "$d"
if [ -d ".git" ]
then
echo "Push modifications $d"
files=$(git ls-files -mo | tr '\n' ' ')
git add -A && git commit -m "Edited files: $files" && git push || :
else
echo "$(pwd) is not a git managed folder"
echo "First connect this to your personal git repository"
fi
done
echo
echo "Finished push operation"
}
if [ "$1" = "pull" ]; then
pull
elif [ "$1" = "push" ]; then
push
else
echo "Usage:
# pull changes
cheatsheets pull
# push changes
cheatsheets push"
fi

View File

@ -25,17 +25,6 @@ linters:
- testpackage - testpackage
- godot - godot
- nestif - nestif
- paralleltest
- nlreturn
- cyclop
- exhaustivestruct
- gci
- gofumpt
- errorlint
- exhaustive
- ifshort
- wrapcheck
- stylecheck
linters-settings: linters-settings:
govet: govet:
@ -47,11 +36,6 @@ linters-settings:
goconst: goconst:
min-len: 8 min-len: 8
min-occurrences: 3 min-occurrences: 3
forbidigo:
forbid:
- (Must)?NewLexer
exclude_godoc_examples: false
issues: issues:
max-per-linter: 0 max-per-linter: 0

View File

@ -6,21 +6,17 @@ release:
brews: brews:
- -
install: bin.install "chroma" install: bin.install "chroma"
env:
- CGO_ENABLED=0
builds: builds:
- goos: - goos:
- linux - linux
- darwin - darwin
- windows - windows
goarch: goarch:
- arm64
- amd64 - amd64
- "386" - "386"
goarm: goarm:
- "6" - "6"
dir: ./cmd/chroma main: ./cmd/chroma/main.go
main: .
ldflags: -s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X main.date={{.Date}} ldflags: -s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X main.date={{.Date}}
binary: chroma binary: chroma
archives: archives:

vendor/github.com/alecthomas/chroma/.travis.yml (new file, generated/vendored, 12 lines)
View File

@ -0,0 +1,12 @@
sudo: false
language: go
go:
- "1.13.x"
script:
- go test -v ./...
- curl -sfL https://install.goreleaser.com/github.com/golangci/golangci-lint.sh | bash -s v1.26.0
- ./bin/golangci-lint run
- git clean -fdx .
after_success:
curl -sL https://git.io/goreleaser | bash && goreleaser

View File

@ -1,7 +1,5 @@
.PHONY: chromad upload all .PHONY: chromad upload all
VERSION ?= $(shell git describe --tags --dirty --always)
all: README.md tokentype_string.go all: README.md tokentype_string.go
README.md: lexers/*/*.go README.md: lexers/*/*.go
@ -11,8 +9,10 @@ tokentype_string.go: types.go
go generate go generate
chromad: chromad:
(cd ./cmd/chromad && go get github.com/GeertJohan/go.rice/rice@master && go install github.com/GeertJohan/go.rice/rice)
rm -f chromad rm -f chromad
(export CGOENABLED=0 GOOS=linux GOARCH=amd64; cd ./cmd/chromad && go build -ldflags="-X 'main.version=$(VERSION)'" -o ../../chromad .) (export CGOENABLED=0 GOOS=linux ; cd ./cmd/chromad && go build -o ../../chromad .)
rice append -i ./cmd/chromad --exec=./chromad
upload: chromad upload: chromad
scp chromad root@swapoff.org: && \ scp chromad root@swapoff.org: && \

View File

@ -1,5 +1,4 @@
# Chroma — A general purpose syntax highlighter in pure Go # Chroma — A general purpose syntax highlighter in pure Go [![Golang Documentation](https://godoc.org/github.com/alecthomas/chroma?status.svg)](https://godoc.org/github.com/alecthomas/chroma) [![Build Status](https://travis-ci.org/alecthomas/chroma.svg)](https://travis-ci.org/alecthomas/chroma) [![Gitter chat](https://badges.gitter.im/alecthomas.svg)](https://gitter.im/alecthomas/Lobby)
[![Golang Documentation](https://godoc.org/github.com/alecthomas/chroma?status.svg)](https://godoc.org/github.com/alecthomas/chroma) [![CI](https://github.com/alecthomas/chroma/actions/workflows/ci.yml/badge.svg)](https://github.com/alecthomas/chroma/actions/workflows/ci.yml) [![Slack chat](https://img.shields.io/static/v1?logo=slack&style=flat&label=slack&color=green&message=gophers)](https://invite.slack.golangbridge.org/)
> **NOTE:** As Chroma has just been released, its API is still in flux. That said, the high-level interface should not change significantly. > **NOTE:** As Chroma has just been released, its API is still in flux. That said, the high-level interface should not change significantly.
@ -37,30 +36,29 @@ translators for Pygments lexers and styles.
Prefix | Language Prefix | Language
:----: | -------- :----: | --------
A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Arduino, Awk A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Arduino, Awk
B | Ballerina, Base Makefile, Bash, Batchfile, BibTeX, Bicep, BlitzBasic, BNF, Brainfuck B | Ballerina, Base Makefile, Bash, Batchfile, BlitzBasic, BNF, Brainfuck
C | C, C#, C++, Caddyfile, Caddyfile Directives, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython C | C, C#, C++, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython
D | D, Dart, Diff, Django/Jinja, Docker, DTD, Dylan D | D, Dart, Diff, Django/Jinja, Docker, DTD
E | EBNF, Elixir, Elm, EmacsLisp, Erlang E | EBNF, Elixir, Elm, EmacsLisp, Erlang
F | Factor, Fish, Forth, Fortran, FSharp F | Factor, Fish, Forth, Fortran, FSharp
G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, Gherkin, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groff, Groovy G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groovy
H | Handlebars, Haskell, Haxe, HCL, Hexdump, HLB, HTML, HTTP, Hy H | Handlebars, Haskell, Haxe, HCL, Hexdump, HTML, HTTP, Hy
I | Idris, Igor, INI, Io I | Idris, INI, Io
J | J, Java, JavaScript, JSON, Julia, Jungle J | J, Java, JavaScript, JSON, Julia, Jungle
K | Kotlin K | Kotlin
L | Lighttpd configuration file, LLVM, Lua L | Lighttpd configuration file, LLVM, Lua
M | Mako, markdown, Mason, Mathematica, Matlab, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL M | Mako, markdown, Mason, Mathematica, Matlab, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL
N | NASM, Newspeak, Nginx configuration file, Nim, Nix N | NASM, Newspeak, Nginx configuration file, Nim, Nix
O | Objective-C, OCaml, Octave, OnesEnterprise, OpenEdge ABL, OpenSCAD, Org Mode O | Objective-C, OCaml, Octave, OpenSCAD, Org Mode
P | PacmanConf, Perl, PHP, PHTML, Pig, PkgConfig, PL/pgSQL, plaintext, Pony, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, PromQL, Protocol Buffer, Puppet, Python 2, Python P | PacmanConf, Perl, PHP, Pig, PkgConfig, PL/pgSQL, plaintext, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, Protocol Buffer, Puppet, Python, Python 3
Q | QBasic Q | QBasic
R | R, Racket, Ragel, Raku, react, ReasonML, reg, reStructuredText, Rexx, Ruby, Rust R | R, Racket, Ragel, react, reg, reStructuredText, Rexx, Ruby, Rust
S | SAS, Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, Snobol, Solidity, SPARQL, SQL, SquidConf, Standard ML, Stylus, Svelte, Swift, SYSTEMD, systemverilog S | Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, SML, Snobol, Solidity, SPARQL, SQL, SquidConf, Swift, SYSTEMD, systemverilog
T | TableGen, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData T | TableGen, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData
V | VB.net, verilog, VHDL, VimL, vue V | VB.net, verilog, VHDL, VimL, vue
W | WDTE W | WDTE
X | XML, Xorg X | XML, Xorg
Y | YAML, YANG Y | YAML
Z | Zig
_I will attempt to keep this section up to date, but an authoritative list can be _I will attempt to keep this section up to date, but an authoritative list can be
@ -185,7 +183,7 @@ following constructor options:
- `ClassPrefix(prefix)` - prefix each generated CSS class. - `ClassPrefix(prefix)` - prefix each generated CSS class.
- `TabWidth(width)` - Set the rendered tab width, in characters. - `TabWidth(width)` - Set the rendered tab width, in characters.
- `WithLineNumbers()` - Render line numbers (style with `LineNumbers`). - `WithLineNumbers()` - Render line numbers (style with `LineNumbers`).
- `LinkableLineNumbers()` - Make the line numbers linkable and be a link to themselves. - `LinkableLineNumbers()` - Make the line numbers linkable.
- `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`). - `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`).
- `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans. - `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans.
@ -211,13 +209,13 @@ using the included Python 3 script `pygments2chroma.py`. I use something like
the following: the following:
```sh ```sh
python3 _tools/pygments2chroma.py \ python3 ~/Projects/chroma/_tools/pygments2chroma.py \
pygments.lexers.jvm.KotlinLexer \ pygments.lexers.jvm.KotlinLexer \
> lexers/k/kotlin.go \ > ~/Projects/chroma/lexers/kotlin.go \
&& gofmt -s -w lexers/k/kotlin.go && gofmt -s -w ~/Projects/chroma/lexers/*.go
``` ```
See notes in [pygments-lexers.txt](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt) See notes in [pygments-lexers.go](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt)
for a list of lexers, and notes on some of the issues importing them. for a list of lexers, and notes on some of the issues importing them.
<a id="markdown-formatters" name="formatters"></a> <a id="markdown-formatters" name="formatters"></a>
@ -250,34 +248,18 @@ For a quick overview of the available styles and how they look, check out the [C
<a id="markdown-command-line-interface" name="command-line-interface"></a> <a id="markdown-command-line-interface" name="command-line-interface"></a>
## Command-line interface ## Command-line interface
A command-line interface to Chroma is included. A command-line interface to Chroma is included. It can be installed with:
Binaries are available to install from [the releases page](https://github.com/alecthomas/chroma/releases). ```sh
go get -u github.com/alecthomas/chroma/cmd/chroma
The CLI can be used as a preprocessor to colorise output of `less(1)`,
see documentation for the `LESSOPEN` environment variable.
The `--fail` flag can be used to suppress output and return with exit status
1 to facilitate falling back to some other preprocessor in case chroma
does not resolve a specific lexer to use for the given file. For example:
```shell
export LESSOPEN='| p() { chroma --fail "$1" || cat "$1"; }; p "%s"'
``` ```
Replace `cat` with your favourite fallback preprocessor.
When invoked as `.lessfilter`, the `--fail` flag is automatically turned
on under the hood for easy integration with [lesspipe shipping with
Debian and derivatives](https://manpages.debian.org/lesspipe#USER_DEFINED_FILTERS);
for that setup the `chroma` executable can be just symlinked to `~/.lessfilter`.
<a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a> <a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a>
## What's missing compared to Pygments? ## What's missing compared to Pygments?
- Quite a few lexers, for various reasons (pull-requests welcome): - Quite a few lexers, for various reasons (pull-requests welcome):
- Pygments lexers for complex languages often include custom code to - Pygments lexers for complex languages often include custom code to
handle certain aspects, such as Raku's ability to nest code inside handle certain aspects, such as Perl6's ability to nest code inside
regular expressions. These require time and effort to convert. regular expressions. These require time and effort to convert.
- I mostly only converted languages I had heard of, to reduce the porting cost. - I mostly only converted languages I had heard of, to reduce the porting cost.
- Some more esoteric features of Pygments are omitted for simplicity. - Some more esoteric features of Pygments are omitted for simplicity.

View File

@ -46,13 +46,6 @@ func WithPreWrapper(wrapper PreWrapper) Option {
} }
} }
// WrapLongLines wraps long lines.
func WrapLongLines(b bool) Option {
return func(f *Formatter) {
f.wrapLongLines = b
}
}
// WithLineNumbers formats output with line numbers. // WithLineNumbers formats output with line numbers.
func WithLineNumbers(b bool) Option { func WithLineNumbers(b bool) Option {
return func(f *Formatter) { return func(f *Formatter) {
@ -138,18 +131,10 @@ var (
} }
defaultPreWrapper = preWrapper{ defaultPreWrapper = preWrapper{
start: func(code bool, styleAttr string) string { start: func(code bool, styleAttr string) string {
if code { return fmt.Sprintf("<pre%s>", styleAttr)
return fmt.Sprintf(`<pre tabindex="0"%s><code>`, styleAttr)
}
return fmt.Sprintf(`<pre tabindex="0"%s>`, styleAttr)
}, },
end: func(code bool) string { end: func(code bool) string {
if code { return "</pre>"
return `</code></pre>`
}
return `</pre>`
}, },
} }
) )
@ -162,7 +147,6 @@ type Formatter struct {
allClasses bool allClasses bool
preWrapper PreWrapper preWrapper PreWrapper
tabWidth int tabWidth int
wrapLongLines bool
lineNumbers bool lineNumbers bool
lineNumbersInTable bool lineNumbersInTable bool
linkableLineNumbers bool linkableLineNumbers bool
@ -213,10 +197,10 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
if wrapInTable { if wrapInTable {
// List line numbers in its own <td> // List line numbers in its own <td>
fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.PreWrapper)) fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.Background))
fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable)) fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable))
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD)) fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
fmt.Fprintf(w, f.preWrapper.Start(false, f.styleAttr(css, chroma.PreWrapper))) fmt.Fprintf(w, f.preWrapper.Start(false, f.styleAttr(css, chroma.Background)))
for index := range lines { for index := range lines {
line := f.baseLineNumber + index line := f.baseLineNumber + index
highlight, next := f.shouldHighlight(highlightIndex, line) highlight, next := f.shouldHighlight(highlightIndex, line)
@ -227,7 +211,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight)) fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
} }
fmt.Fprintf(w, "<span%s%s>%s\n</span>", f.styleAttr(css, chroma.LineNumbersTable), f.lineIDAttribute(line), f.lineTitleWithLinkIfNeeded(lineDigits, line)) fmt.Fprintf(w, "<span%s%s>%*d\n</span>", f.styleAttr(css, chroma.LineNumbersTable), f.lineIDAttribute(line), lineDigits, line)
if highlight { if highlight {
fmt.Fprintf(w, "</span>") fmt.Fprintf(w, "</span>")
@ -238,7 +222,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD, "width:100%")) fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD, "width:100%"))
} }
fmt.Fprintf(w, f.preWrapper.Start(true, f.styleAttr(css, chroma.PreWrapper))) fmt.Fprintf(w, f.preWrapper.Start(true, f.styleAttr(css, chroma.Background)))
highlightIndex = 0 highlightIndex = 0
for index, tokens := range lines { for index, tokens := range lines {
@ -248,28 +232,14 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
if next { if next {
highlightIndex++ highlightIndex++
} }
// Start of Line
fmt.Fprint(w, `<span`)
if highlight { if highlight {
// Line + LineHighlight fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
if f.Classes {
fmt.Fprintf(w, ` class="%s %s"`, f.class(chroma.Line), f.class(chroma.LineHighlight))
} else {
fmt.Fprintf(w, ` style="%s %s"`, css[chroma.Line], css[chroma.LineHighlight])
}
fmt.Fprint(w, `>`)
} else {
fmt.Fprintf(w, "%s>", f.styleAttr(css, chroma.Line))
} }
// Line number
if f.lineNumbers && !wrapInTable { if f.lineNumbers && !wrapInTable {
fmt.Fprintf(w, "<span%s%s>%s</span>", f.styleAttr(css, chroma.LineNumbers), f.lineIDAttribute(line), f.lineTitleWithLinkIfNeeded(lineDigits, line)) fmt.Fprintf(w, "<span%s%s>%*d</span>", f.styleAttr(css, chroma.LineNumbers), f.lineIDAttribute(line), lineDigits, line)
} }
fmt.Fprintf(w, `<span%s>`, f.styleAttr(css, chroma.CodeLine))
for _, token := range tokens { for _, token := range tokens {
html := html.EscapeString(token.String()) html := html.EscapeString(token.String())
attr := f.styleAttr(css, token.Type) attr := f.styleAttr(css, token.Type)
@ -278,10 +248,9 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
} }
fmt.Fprint(w, html) fmt.Fprint(w, html)
} }
if highlight {
fmt.Fprint(w, `</span>`) // End of CodeLine fmt.Fprintf(w, "</span>")
}
fmt.Fprint(w, `</span>`) // End of Line
} }
fmt.Fprintf(w, f.preWrapper.End(true)) fmt.Fprintf(w, f.preWrapper.End(true))
@ -303,19 +272,7 @@ func (f *Formatter) lineIDAttribute(line int) string {
if !f.linkableLineNumbers { if !f.linkableLineNumbers {
return "" return ""
} }
return fmt.Sprintf(" id=\"%s\"", f.lineID(line)) return fmt.Sprintf(" id=\"%s%d\"", f.lineNumbersIDPrefix, line)
}
func (f *Formatter) lineTitleWithLinkIfNeeded(lineDigits, line int) string {
title := fmt.Sprintf("%*d", lineDigits, line)
if !f.linkableLineNumbers {
return title
}
return fmt.Sprintf("<a style=\"outline: none; text-decoration:none; color:inherit\" href=\"#%s\">%s</a>", f.lineID(line), title)
}
func (f *Formatter) lineID(line int) string {
return fmt.Sprintf("%s%d", f.lineNumbersIDPrefix, line)
} }
func (f *Formatter) shouldHighlight(highlightIndex, line int) (bool, bool) { func (f *Formatter) shouldHighlight(highlightIndex, line int) (bool, bool) {
@ -382,11 +339,7 @@ func (f *Formatter) tabWidthStyle() string {
func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error { func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error {
css := f.styleToCSS(style) css := f.styleToCSS(style)
// Special-case background as it is mapped to the outer ".chroma" class. // Special-case background as it is mapped to the outer ".chroma" class.
if _, err := fmt.Fprintf(w, "/* %s */ .%sbg { %s }\n", chroma.Background, f.prefix, css[chroma.Background]); err != nil { if _, err := fmt.Fprintf(w, "/* %s */ .%schroma { %s }\n", chroma.Background, f.prefix, css[chroma.Background]); err != nil {
return err
}
// Special-case PreWrapper as it is the ".chroma" class.
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma { %s }\n", chroma.PreWrapper, f.prefix, css[chroma.PreWrapper]); err != nil {
return err return err
} }
// Special-case code column of table to expand width. // Special-case code column of table to expand width.
@ -410,8 +363,7 @@ func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error {
sort.Ints(tts) sort.Ints(tts)
for _, ti := range tts { for _, ti := range tts {
tt := chroma.TokenType(ti) tt := chroma.TokenType(ti)
switch tt { if tt == chroma.Background {
case chroma.Background, chroma.PreWrapper:
continue continue
} }
class := f.class(tt) class := f.class(tt)
@ -441,21 +393,12 @@ func (f *Formatter) styleToCSS(style *chroma.Style) map[chroma.TokenType]string
classes[t] = StyleEntryToCSS(entry) classes[t] = StyleEntryToCSS(entry)
} }
classes[chroma.Background] += f.tabWidthStyle() classes[chroma.Background] += f.tabWidthStyle()
classes[chroma.PreWrapper] += classes[chroma.Background] + `;` lineNumbersStyle := "margin-right: 0.4em; padding: 0 0.4em 0 0.4em;"
// Make PreWrapper a grid to show highlight style with full width.
if len(f.highlightRanges) > 0 {
classes[chroma.PreWrapper] += `display: grid;`
}
// Make PreWrapper wrap long lines.
if f.wrapLongLines {
classes[chroma.PreWrapper] += `white-space: pre-wrap; word-break: break-word;`
}
lineNumbersStyle := `white-space: pre; user-select: none; margin-right: 0.4em; padding: 0 0.4em 0 0.4em;`
// All rules begin with default rules followed by user provided rules // All rules begin with default rules followed by user provided rules
classes[chroma.Line] = `display: flex;` + classes[chroma.Line]
classes[chroma.LineNumbers] = lineNumbersStyle + classes[chroma.LineNumbers] classes[chroma.LineNumbers] = lineNumbersStyle + classes[chroma.LineNumbers]
classes[chroma.LineNumbersTable] = lineNumbersStyle + classes[chroma.LineNumbersTable] classes[chroma.LineNumbersTable] = lineNumbersStyle + classes[chroma.LineNumbersTable]
classes[chroma.LineTable] = "border-spacing: 0; padding: 0; margin: 0; border: 0;" + classes[chroma.LineTable] classes[chroma.LineHighlight] = "display: block; width: 100%;" + classes[chroma.LineHighlight]
classes[chroma.LineTable] = "border-spacing: 0; padding: 0; margin: 0; border: 0; width: auto; overflow: auto; display: block;" + classes[chroma.LineTable]
classes[chroma.LineTableTD] = "vertical-align: top; padding: 0; margin: 0; border: 0;" + classes[chroma.LineTableTD] classes[chroma.LineTableTD] = "vertical-align: top; padding: 0; margin: 0; border: 0;" + classes[chroma.LineTableTD]
return classes return classes
} }
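The HTML formatter diff above adds a `WrapLongLines` option and a `PreWrapper`/`Line`/`CodeLine` class structure. Assuming the newer chroma API shown on the left-hand side of the diff (where `WithLineNumbers` takes a bool and `WrapLongLines` exists), usage looks roughly like this:

```go
package main

import (
	"os"

	"github.com/alecthomas/chroma/formatters/html"
	"github.com/alecthomas/chroma/lexers"
	"github.com/alecthomas/chroma/styles"
)

func main() {
	source := "package main\n\nfunc main() { println(\"hi\") }\n"

	lexer := lexers.Get("go")
	if lexer == nil {
		lexer = lexers.Fallback
	}
	style := styles.Get("monokai")

	formatter := html.New(
		html.WithLineNumbers(true), // bool-taking option, per the diff
		html.WrapLongLines(true),   // option added in this diff
	)

	iterator, err := lexer.Tokenise(nil, source)
	if err != nil {
		panic(err)
	}
	if err := formatter.Format(os.Stdout, style, iterator); err != nil {
		panic(err)
	}
}
```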

View File

@ -120,7 +120,7 @@ func maxLineWidth(lines [][]chroma.Token) int {
for _, tokens := range lines { for _, tokens := range lines {
length := 0 length := 0
for _, token := range tokens { for _, token := range tokens {
length += len(strings.ReplaceAll(token.String(), ` `, " ")) length += len(strings.Replace(token.String(), ` `, " ", -1))
} }
if length > maxWidth { if length > maxWidth {
maxWidth = length maxWidth = length
@ -136,7 +136,7 @@ func (f *Formatter) writeTokenBackgrounds(w io.Writer, lines [][]chroma.Token, s
for index, tokens := range lines { for index, tokens := range lines {
lineLength := 0 lineLength := 0
for _, token := range tokens { for _, token := range tokens {
length := len(strings.ReplaceAll(token.String(), ` `, " ")) length := len(strings.Replace(token.String(), ` `, " ", -1))
tokenBackground := style.Get(token.Type).Background tokenBackground := style.Get(token.Type).Background
if tokenBackground.IsSet() && tokenBackground != style.Get(chroma.Background).Background { if tokenBackground.IsSet() && tokenBackground != style.Get(chroma.Background).Background {
fmt.Fprintf(w, "<rect id=\"%s\" x=\"%dch\" y=\"%fem\" width=\"%dch\" height=\"1.2em\" fill=\"%s\" />\n", escapeString(token.String()), lineLength, 1.2*float64(index)+0.25, length, style.Get(token.Type).Background.String()) fmt.Fprintf(w, "<rect id=\"%s\" x=\"%dch\" y=\"%fem\" width=\"%dch\" height=\"1.2em\" fill=\"%s\" />\n", escapeString(token.String()), lineLength, 1.2*float64(index)+0.25, length, style.Get(token.Type).Background.String())

View File

@ -17,20 +17,6 @@ var c = chroma.MustParseColour
var ttyTables = map[int]*ttyTable{ var ttyTables = map[int]*ttyTable{
8: { 8: {
foreground: map[chroma.Colour]string{
c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m",
c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m",
c("#555555"): "\033[1m\033[30m", c("#ff0000"): "\033[1m\033[31m", c("#00ff00"): "\033[1m\033[32m", c("#ffff00"): "\033[1m\033[33m",
c("#0000ff"): "\033[1m\033[34m", c("#ff00ff"): "\033[1m\033[35m", c("#00ffff"): "\033[1m\033[36m", c("#ffffff"): "\033[1m\033[37m",
},
background: map[chroma.Colour]string{
c("#000000"): "\033[40m", c("#7f0000"): "\033[41m", c("#007f00"): "\033[42m", c("#7f7fe0"): "\033[43m",
c("#00007f"): "\033[44m", c("#7f007f"): "\033[45m", c("#007f7f"): "\033[46m", c("#e5e5e5"): "\033[47m",
c("#555555"): "\033[1m\033[40m", c("#ff0000"): "\033[1m\033[41m", c("#00ff00"): "\033[1m\033[42m", c("#ffff00"): "\033[1m\033[43m",
c("#0000ff"): "\033[1m\033[44m", c("#ff00ff"): "\033[1m\033[45m", c("#00ffff"): "\033[1m\033[46m", c("#ffffff"): "\033[1m\033[47m",
},
},
16: {
foreground: map[chroma.Colour]string{ foreground: map[chroma.Colour]string{
c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m", c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m",
c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m", c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m",
@ -241,11 +227,15 @@ type indexedTTYFormatter struct {
func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma.Iterator) (err error) { func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma.Iterator) (err error) {
theme := styleToEscapeSequence(c.table, style) theme := styleToEscapeSequence(c.table, style)
for token := it(); token != chroma.EOF; token = it() { for token := it(); token != chroma.EOF; token = it() {
// TODO: Cache token lookups?
clr, ok := theme[token.Type] clr, ok := theme[token.Type]
if !ok { if !ok {
clr, ok = theme[token.Type.SubCategory()] clr, ok = theme[token.Type.SubCategory()]
if !ok { if !ok {
clr = theme[token.Type.Category()] clr = theme[token.Type.Category()]
// if !ok {
// clr = theme[chroma.InheritStyle]
// }
} }
} }
if clr != "" { if clr != "" {
@ -259,22 +249,10 @@ func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma
return nil return nil
} }
// TTY is an 8-colour terminal formatter.
//
// The Lab colour space is used to map RGB values to the most appropriate index colour.
var TTY = Register("terminal", &indexedTTYFormatter{ttyTables[8]})
// TTY8 is an 8-colour terminal formatter. // TTY8 is an 8-colour terminal formatter.
// //
// The Lab colour space is used to map RGB values to the most appropriate index colour. // The Lab colour space is used to map RGB values to the most appropriate index colour.
var TTY8 = Register("terminal8", &indexedTTYFormatter{ttyTables[8]}) var TTY8 = Register("terminal", &indexedTTYFormatter{ttyTables[8]})
// TTY16 is a 16-colour terminal formatter.
//
// It uses \033[3xm for normal colours and \033[90Xm for bright colours.
//
// The Lab colour space is used to map RGB values to the most appropriate index colour.
var TTY16 = Register("terminal16", &indexedTTYFormatter{ttyTables[16]})
// TTY256 is a 256-colour terminal formatter. // TTY256 is a 256-colour terminal formatter.
// //

View File

@ -3,7 +3,16 @@ module github.com/alecthomas/chroma
go 1.13 go 1.13
require ( require (
github.com/davecgh/go-spew v1.1.1 // indirect github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38
github.com/dlclark/regexp2 v1.4.0 github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 // indirect
github.com/stretchr/testify v1.7.0 github.com/alecthomas/kong v0.2.4
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 // indirect
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964
github.com/dlclark/regexp2 v1.2.0
github.com/mattn/go-colorable v0.1.6
github.com/mattn/go-isatty v0.0.12
github.com/pkg/errors v0.9.1 // indirect
github.com/sergi/go-diff v1.0.0 // indirect
github.com/stretchr/testify v1.3.0 // indirect
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 // indirect
) )

View File

@ -1,14 +1,36 @@
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
github.com/alecthomas/kong v0.2.4 h1:Y0ZBCHAvHhTHw7FFJ2FzCAAG4pkbTgA45nc7BpMhDNk=
github.com/alecthomas/kong v0.2.4/go.mod h1:kQOmtJgV+Lb4aj+I2LEn40cbtawdWJ9Y8QLq+lElKxE=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c= github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38= github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.4.0 h1:F1rxgk7p4uKjwIQxBs9oAXe5CqrXlCduYEJvrF4u93E= github.com/dlclark/regexp2 v1.2.0 h1:8sAhBGEM0dRWogWqWyQeIJnxjWO6oIjl8FKqREDsGfk=
github.com/dlclark/regexp2 v1.4.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc= github.com/dlclark/regexp2 v1.2.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/mattn/go-colorable v0.1.6 h1:6Su7aK7lXmJ/U79bYtBjLNaha4Fs1Rg9plHpcH+vvnE=
github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
github.com/mattn/go-isatty v0.0.12 h1:wuysRhFDzyxgEmMf5xjvJ2M9dZoWAXNNr5LSBS7uHXY=
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM= github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4= github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME= github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.7.0 h1:nwc3DEeHmmLAfoZucVR881uASk0Mfjw8xYJ99tb5CcY= github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg= github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM= github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0= github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo= golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM= golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 h1:opSr2sbRXk5X5/givKrrKj9HXxFpW2sdCiP8MJSKLQY=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=

View File

@ -4,7 +4,7 @@ import "strings"
// An Iterator across tokens. // An Iterator across tokens.
// //
// EOF will be returned at the end of the Token stream. // nil will be returned at the end of the Token stream.
// //
// If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover. // If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
type Iterator func() Token type Iterator func() Token
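The newer side of this hunk documents an EOF sentinel at the end of the token stream where the older comment said nil. A hedged sketch of draining an Iterator under the EOF convention (chroma.EOF is the sentinel the updated comment refers to; the package and function names here are illustrative only):

package example

import (
	"fmt"

	"github.com/alecthomas/chroma"
)

// PrintTokens pulls tokens from the iterator until the EOF sentinel comes back.
func PrintTokens(it chroma.Iterator) {
	for tok := it(); tok != chroma.EOF; tok = it() {
		fmt.Printf("%s %q\n", tok.Type, tok.Value)
	}
}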

View File

@ -2,7 +2,6 @@ package chroma
import ( import (
"fmt" "fmt"
"strings"
) )
var ( var (
@ -99,11 +98,9 @@ type Lexer interface {
// Lexers is a slice of lexers sortable by name. // Lexers is a slice of lexers sortable by name.
type Lexers []Lexer type Lexers []Lexer
func (l Lexers) Len() int { return len(l) } func (l Lexers) Len() int { return len(l) }
func (l Lexers) Swap(i, j int) { l[i], l[j] = l[j], l[i] } func (l Lexers) Swap(i, j int) { l[i], l[j] = l[j], l[i] }
func (l Lexers) Less(i, j int) bool { func (l Lexers) Less(i, j int) bool { return l[i].Config().Name < l[j].Config().Name }
return strings.ToLower(l[i].Config().Name) < strings.ToLower(l[j].Config().Name)
}
// PrioritisedLexers is a slice of lexers sortable by priority. // PrioritisedLexers is a slice of lexers sortable by priority.
type PrioritisedLexers []Lexer type PrioritisedLexers []Lexer

View File

@ -3,9 +3,6 @@
The tests in this directory feed a known input `testdata/<name>.actual` into the parser for `<name>` and check The tests in this directory feed a known input `testdata/<name>.actual` into the parser for `<name>` and check
that its output matches `<name>.exported`. that its output matches `<name>.exported`.
It is also possible to perform several tests on the same parser `<name>`, by placing known inputs `*.actual` into a
directory `testdata/<name>/`.
## Running the tests ## Running the tests
Run the tests as normal: Run the tests as normal:

View File

@ -6,7 +6,7 @@ import (
) )
// ABAP lexer. // ABAP lexer.
var Abap = internal.Register(MustNewLazyLexer( var Abap = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ABAP", Name: "ABAP",
Aliases: []string{"abap"}, Aliases: []string{"abap"},
@ -14,11 +14,7 @@ var Abap = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-abap"}, MimeTypes: []string{"text/x-abap"},
CaseInsensitive: true, CaseInsensitive: true,
}, },
abapRules, Rules{
))
func abapRules() Rules {
return Rules{
"common": { "common": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`^\*.*$`, CommentSingle, nil}, {`^\*.*$`, CommentSingle, nil},
@ -56,5 +52,5 @@ func abapRules() Rules {
{`[/;:()\[\],.]`, Punctuation, nil}, {`[/;:()\[\],.]`, Punctuation, nil},
{`(!)(\w+)`, ByGroups(Operator, Name), nil}, {`(!)(\w+)`, ByGroups(Operator, Name), nil},
}, },
} },
} ))
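The same mechanical change repeats through the lexer files below: the inline Rules literal handed to MustNewLexer becomes a rules-constructor function handed to MustNewLazyLexer, so rule maps are built on first use instead of at package init. A hypothetical "Example" lexer sketching the lazy pattern (the name and rules are illustrative only, not part of this diff):

package e

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Example lexer, registered lazily: exampleRules runs when the lexer is first used.
var Example = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Example",
		Aliases:   []string{"example"},
		Filenames: []string{"*.example"},
		MimeTypes: []string{"text/x-example"},
	},
	exampleRules,
))

func exampleRules() Rules {
	return Rules{
		"root": {
			{`#.*$`, CommentSingle, nil},
			{`\s+`, Text, nil},
			{`.`, Text, nil},
		},
	}
}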

View File

@ -6,18 +6,14 @@ import (
) )
// Abnf lexer. // Abnf lexer.
var Abnf = internal.Register(MustNewLazyLexer( var Abnf = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ABNF", Name: "ABNF",
Aliases: []string{"abnf"}, Aliases: []string{"abnf"},
Filenames: []string{"*.abnf"}, Filenames: []string{"*.abnf"},
MimeTypes: []string{"text/x-abnf"}, MimeTypes: []string{"text/x-abnf"},
}, },
abnfRules, Rules{
))
func abnfRules() Rules {
return Rules{
"root": { "root": {
{`;.*$`, CommentSingle, nil}, {`;.*$`, CommentSingle, nil},
{`(%[si])?"[^"]*"`, Literal, nil}, {`(%[si])?"[^"]*"`, Literal, nil},
@ -38,5 +34,5 @@ func abnfRules() Rules {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`.`, Text, nil}, {`.`, Text, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Actionscript lexer. // Actionscript lexer.
var Actionscript = internal.Register(MustNewLazyLexer( var Actionscript = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ActionScript", Name: "ActionScript",
Aliases: []string{"as", "actionscript"}, Aliases: []string{"as", "actionscript"},
@ -15,11 +15,7 @@ var Actionscript = internal.Register(MustNewLazyLexer(
NotMultiline: true, NotMultiline: true,
DotAll: true, DotAll: true,
}, },
actionscriptRules, Rules{
))
func actionscriptRules() Rules {
return Rules{
"root": { "root": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`//.*?\n`, CommentSingle, nil}, {`//.*?\n`, CommentSingle, nil},
@ -39,5 +35,5 @@ func actionscriptRules() Rules {
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil}, {`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil}, {`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Actionscript 3 lexer. // Actionscript 3 lexer.
var Actionscript3 = internal.Register(MustNewLazyLexer( var Actionscript3 = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ActionScript 3", Name: "ActionScript 3",
Aliases: []string{"as3", "actionscript3"}, Aliases: []string{"as3", "actionscript3"},
@ -14,11 +14,7 @@ var Actionscript3 = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"application/x-actionscript3", "text/x-actionscript3", "text/actionscript3"}, MimeTypes: []string{"application/x-actionscript3", "text/x-actionscript3", "text/actionscript3"},
DotAll: true, DotAll: true,
}, },
actionscript3Rules, Rules{
))
func actionscript3Rules() Rules {
return Rules{
"root": { "root": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`(function\s+)([$a-zA-Z_]\w*)(\s*)(\()`, ByGroups(KeywordDeclaration, NameFunction, Text, Operator), Push("funcparams")}, {`(function\s+)([$a-zA-Z_]\w*)(\s*)(\()`, ByGroups(KeywordDeclaration, NameFunction, Text, Operator), Push("funcparams")},
@ -56,5 +52,5 @@ func actionscript3Rules() Rules {
{`,`, Operator, Pop(1)}, {`,`, Operator, Pop(1)},
Default(Pop(1)), Default(Pop(1)),
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Ada lexer. // Ada lexer.
var Ada = internal.Register(MustNewLazyLexer( var Ada = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Ada", Name: "Ada",
Aliases: []string{"ada", "ada95", "ada2005"}, Aliases: []string{"ada", "ada95", "ada2005"},
@ -14,11 +14,7 @@ var Ada = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-ada"}, MimeTypes: []string{"text/x-ada"},
CaseInsensitive: true, CaseInsensitive: true,
}, },
adaRules, Rules{
))
func adaRules() Rules {
return Rules{
"root": { "root": {
{`[^\S\n]+`, Text, nil}, {`[^\S\n]+`, Text, nil},
{`--.*?\n`, CommentSingle, nil}, {`--.*?\n`, CommentSingle, nil},
@ -114,5 +110,5 @@ func adaRules() Rules {
{`\)`, Punctuation, Pop(1)}, {`\)`, Punctuation, Pop(1)},
Include("root"), Include("root"),
}, },
} },
} ))

View File

@ -1,47 +0,0 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Al lexer.
var Al = internal.Register(MustNewLazyLexer(
&Config{
Name: "AL",
Aliases: []string{"al"},
Filenames: []string{"*.al", "*.dal"},
MimeTypes: []string{"text/x-al"},
DotAll: true,
CaseInsensitive: true,
},
alRules,
))
// https://github.com/microsoft/AL/blob/master/grammar/alsyntax.tmlanguage
func alRules() Rules {
return Rules{
"root": {
{`\s+`, TextWhitespace, nil},
{`(?s)\/\*.*?\\*\*\/`, CommentMultiline, nil},
{`(?s)//.*?\n`, CommentSingle, nil},
{`\"([^\"])*\"`, Text, nil},
{`'([^'])*'`, LiteralString, nil},
{`\b(?i:(ARRAY|ASSERTERROR|BEGIN|BREAK|CASE|DO|DOWNTO|ELSE|END|EVENT|EXIT|FOR|FOREACH|FUNCTION|IF|IMPLEMENTS|IN|INDATASET|INTERFACE|INTERNAL|LOCAL|OF|PROCEDURE|PROGRAM|PROTECTED|REPEAT|RUNONCLIENT|SECURITYFILTERING|SUPPRESSDISPOSE|TEMPORARY|THEN|TO|TRIGGER|UNTIL|VAR|WHILE|WITH|WITHEVENTS))\b`, Keyword, nil},
{`\b(?i:(AND|DIV|MOD|NOT|OR|XOR))\b`, OperatorWord, nil},
{`\b(?i:(AVERAGE|CONST|COUNT|EXIST|FIELD|FILTER|LOOKUP|MAX|MIN|ORDER|SORTING|SUM|TABLEDATA|UPPERLIMIT|WHERE|ASCENDING|DESCENDING))\b`, Keyword, nil},
{`\b(?i:(CODEUNIT|PAGE|PAGEEXTENSION|PAGECUSTOMIZATION|DOTNET|ENUM|ENUMEXTENSION|VALUE|QUERY|REPORT|TABLE|TABLEEXTENSION|XMLPORT|PROFILE|CONTROLADDIN|REPORTEXTENSION|INTERFACE|PERMISSIONSET|PERMISSIONSETEXTENSION|ENTITLEMENT))\b`, Keyword, nil},
{`\b(?i:(Action|Array|Automation|BigInteger|BigText|Blob|Boolean|Byte|Char|ClientType|Code|Codeunit|CompletionTriggerErrorLevel|ConnectionType|Database|DataClassification|DataScope|Date|DateFormula|DateTime|Decimal|DefaultLayout|Dialog|Dictionary|DotNet|DotNetAssembly|DotNetTypeDeclaration|Duration|Enum|ErrorInfo|ErrorType|ExecutionContext|ExecutionMode|FieldClass|FieldRef|FieldType|File|FilterPageBuilder|Guid|InStream|Integer|Joker|KeyRef|List|ModuleDependencyInfo|ModuleInfo|None|Notification|NotificationScope|ObjectType|Option|OutStream|Page|PageResult|Query|Record|RecordId|RecordRef|Report|ReportFormat|SecurityFilter|SecurityFiltering|Table|TableConnectionType|TableFilter|TestAction|TestField|TestFilterField|TestPage|TestPermissions|TestRequestPage|Text|TextBuilder|TextConst|TextEncoding|Time|TransactionModel|TransactionType|Variant|Verbosity|Version|XmlPort|HttpContent|HttpHeaders|HttpClient|HttpRequestMessage|HttpResponseMessage|JsonToken|JsonValue|JsonArray|JsonObject|View|Views|XmlAttribute|XmlAttributeCollection|XmlComment|XmlCData|XmlDeclaration|XmlDocument|XmlDocumentType|XmlElement|XmlNamespaceManager|XmlNameTable|XmlNode|XmlNodeList|XmlProcessingInstruction|XmlReadOptions|XmlText|XmlWriteOptions|WebServiceActionContext|WebServiceActionResultCode|SessionSettings))\b`, Keyword, nil},
{`\b([<>]=|<>|<|>)\b?`, Operator, nil},
{`\b(\-|\+|\/|\*)\b`, Operator, nil},
{`\s*(\:=|\+=|-=|\/=|\*=)\s*?`, Operator, nil},
{`\b(?i:(ADD|ADDFIRST|ADDLAST|ADDAFTER|ADDBEFORE|ACTION|ACTIONS|AREA|ASSEMBLY|CHARTPART|CUEGROUP|CUSTOMIZES|COLUMN|DATAITEM|DATASET|ELEMENTS|EXTENDS|FIELD|FIELDGROUP|FIELDATTRIBUTE|FIELDELEMENT|FIELDGROUPS|FIELDS|FILTER|FIXED|GRID|GROUP|MOVEAFTER|MOVEBEFORE|KEY|KEYS|LABEL|LABELS|LAYOUT|MODIFY|MOVEFIRST|MOVELAST|MOVEBEFORE|MOVEAFTER|PART|REPEATER|USERCONTROL|REQUESTPAGE|SCHEMA|SEPARATOR|SYSTEMPART|TABLEELEMENT|TEXTATTRIBUTE|TEXTELEMENT|TYPE))\b`, Keyword, nil},
{`\s*[(\.\.)&\|]\s*`, Operator, nil},
{`\b((0(x|X)[0-9a-fA-F]*)|(([0-9]+\.?[0-9]*)|(\.[0-9]+))((e|E)(\+|-)?[0-9]+)?)(L|l|UL|ul|u|U|F|f|ll|LL|ull|ULL)?\b`, LiteralNumber, nil},
{`[;:,]`, Punctuation, nil},
{`#[ \t]*(if|else|elif|endif|define|undef|region|endregion|pragma)\b.*?\n`, CommentPreproc, nil},
{`\w+`, Text, nil},
{`.`, Text, nil},
},
}
}

View File

@ -6,18 +6,14 @@ import (
) )
// Angular2 lexer. // Angular2 lexer.
var Angular2 = internal.Register(MustNewLazyLexer( var Angular2 = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Angular2", Name: "Angular2",
Aliases: []string{"ng2"}, Aliases: []string{"ng2"},
Filenames: []string{}, Filenames: []string{},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
angular2Rules, Rules{
))
func angular2Rules() Rules {
return Rules{
"root": { "root": {
{`[^{([*#]+`, Other, nil}, {`[^{([*#]+`, Other, nil},
{`(\{\{)(\s*)`, ByGroups(CommentPreproc, Text), Push("ngExpression")}, {`(\{\{)(\s*)`, ByGroups(CommentPreproc, Text), Push("ngExpression")},
@ -42,5 +38,5 @@ func angular2Rules() Rules {
{`'.*?'`, LiteralString, Pop(1)}, {`'.*?'`, LiteralString, Pop(1)},
{`[^\s>]+`, LiteralString, Pop(1)}, {`[^\s>]+`, LiteralString, Pop(1)},
}, },
} },
} ))

View File

@ -6,18 +6,14 @@ import (
) )
// ANTLR lexer. // ANTLR lexer.
var ANTLR = internal.Register(MustNewLazyLexer( var ANTLR = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ANTLR", Name: "ANTLR",
Aliases: []string{"antlr"}, Aliases: []string{"antlr"},
Filenames: []string{}, Filenames: []string{},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
antlrRules, Rules{
))
func antlrRules() Rules {
return Rules{
"whitespace": { "whitespace": {
{`\s+`, TextWhitespace, nil}, {`\s+`, TextWhitespace, nil},
}, },
@ -101,5 +97,5 @@ func antlrRules() Rules {
{`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil}, {`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil},
{`(\\\\|\\\]|\\\[|[^\[\]])+`, Other, nil}, {`(\\\\|\\\]|\\\[|[^\[\]])+`, Other, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Apacheconf lexer. // Apacheconf lexer.
var Apacheconf = internal.Register(MustNewLazyLexer( var Apacheconf = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "ApacheConf", Name: "ApacheConf",
Aliases: []string{"apacheconf", "aconf", "apache"}, Aliases: []string{"apacheconf", "aconf", "apache"},
@ -14,11 +14,7 @@ var Apacheconf = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-apacheconf"}, MimeTypes: []string{"text/x-apacheconf"},
CaseInsensitive: true, CaseInsensitive: true,
}, },
apacheconfRules, Rules{
))
func apacheconfRules() Rules {
return Rules{
"root": { "root": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`(#.*?)$`, Comment, nil}, {`(#.*?)$`, Comment, nil},
@ -38,5 +34,5 @@ func apacheconfRules() Rules {
{`"([^"\\]*(?:\\.[^"\\]*)*)"`, LiteralStringDouble, nil}, {`"([^"\\]*(?:\\.[^"\\]*)*)"`, LiteralStringDouble, nil},
{`[^\s"\\]+`, Text, nil}, {`[^\s"\\]+`, Text, nil},
}, },
} },
} ))

View File

@ -6,18 +6,14 @@ import (
) )
// Apl lexer. // Apl lexer.
var Apl = internal.Register(MustNewLazyLexer( var Apl = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "APL", Name: "APL",
Aliases: []string{"apl"}, Aliases: []string{"apl"},
Filenames: []string{"*.apl"}, Filenames: []string{"*.apl"},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
aplRules, Rules{
))
func aplRules() Rules {
return Rules{
"root": { "root": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`[⍝#].*$`, CommentSingle, nil}, {`[⍝#].*$`, CommentSingle, nil},
@ -26,15 +22,15 @@ func aplRules() Rules {
{`[⋄◇()]`, Punctuation, nil}, {`[⋄◇()]`, Punctuation, nil},
{`[\[\];]`, LiteralStringRegex, nil}, {`[\[\];]`, LiteralStringRegex, nil},
{`⎕[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameFunction, nil}, {`⎕[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameFunction, nil},
{`[A-Za-zΔ∆⍙_][A-Za-zΔ∆⍙_¯0-9]*`, NameVariable, nil}, {`[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameVariable, nil},
{`¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞)([Jj]¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞))?`, LiteralNumber, nil}, {`¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞)([Jj]¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞))?`, LiteralNumber, nil},
{`[\.\\/⌿⍀¨⍣⍨⍠⍤∘⍥@⌺⌶⍢]`, NameAttribute, nil}, {`[\.\\/⌿⍀¨⍣⍨⍠⍤∘]`, NameAttribute, nil},
{`[+\-×÷⌈⌊∣|?*⍟○!⌹<≤=>≥≠≡≢∊⍷∪∩~∨∧⍱⍲⍴,⍪⌽⊖⍉↑↓⊂⊃⌷⍋⍒⊤⊥⍕⍎⊣⊢⍁⍂≈⌸⍯↗⊆⍸]`, Operator, nil}, {`[+\-×÷⌈⌊∣|?*⍟○!⌹<≤=>≥≠≡≢∊⍷∪∩~∨∧⍱⍲⍴,⍪⌽⊖⍉↑↓⊂⊃⌷⍋⍒⊤⊥⍕⍎⊣⊢⍁⍂≈⌸⍯↗]`, Operator, nil},
{``, NameConstant, nil}, {``, NameConstant, nil},
{`[⎕⍞]`, NameVariableGlobal, nil}, {`[⎕⍞]`, NameVariableGlobal, nil},
{`[←→]`, KeywordDeclaration, nil}, {`[←→]`, KeywordDeclaration, nil},
{`[⍺⍵⍶⍹∇:]`, NameBuiltinPseudo, nil}, {`[⍺⍵⍶⍹∇:]`, NameBuiltinPseudo, nil},
{`[{}]`, KeywordType, nil}, {`[{}]`, KeywordType, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Applescript lexer. // Applescript lexer.
var Applescript = internal.Register(MustNewLazyLexer( var Applescript = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "AppleScript", Name: "AppleScript",
Aliases: []string{"applescript"}, Aliases: []string{"applescript"},
@ -14,11 +14,7 @@ var Applescript = internal.Register(MustNewLazyLexer(
MimeTypes: []string{}, MimeTypes: []string{},
DotAll: true, DotAll: true,
}, },
applescriptRules, Rules{
))
func applescriptRules() Rules {
return Rules{
"root": { "root": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`¬\n`, LiteralStringEscape, nil}, {`¬\n`, LiteralStringEscape, nil},
@ -55,5 +51,5 @@ func applescriptRules() Rules {
{`[^*(]+`, CommentMultiline, nil}, {`[^*(]+`, CommentMultiline, nil},
{`[*(]`, CommentMultiline, nil}, {`[*(]`, CommentMultiline, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Arduino lexer. // Arduino lexer.
var Arduino = internal.Register(MustNewLazyLexer( var Arduino = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Arduino", Name: "Arduino",
Aliases: []string{"arduino"}, Aliases: []string{"arduino"},
@ -14,11 +14,7 @@ var Arduino = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-arduino"}, MimeTypes: []string{"text/x-arduino"},
EnsureNL: true, EnsureNL: true,
}, },
arduinoRules, Rules{
))
func arduinoRules() Rules {
return Rules{
"statements": { "statements": {
{Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`), Keyword, nil}, {Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`), Keyword, nil},
{`char(16_t|32_t)\b`, KeywordType, nil}, {`char(16_t|32_t)\b`, KeywordType, nil},
@ -110,5 +106,5 @@ func arduinoRules() Rules {
{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)}, {`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
{`.*?\n`, Comment, nil}, {`.*?\n`, Comment, nil},
}, },
} },
} ))

View File

@ -1,72 +0,0 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
var ArmAsm = internal.Register(MustNewLazyLexer(
&Config{
Name: "ArmAsm",
Aliases: []string{"armasm"},
EnsureNL: true,
Filenames: []string{"*.s", "*.S"},
MimeTypes: []string{"text/x-armasm", "text/x-asm"},
},
armasmRules,
))
func armasmRules() Rules {
return Rules{
"commentsandwhitespace": {
{`\s+`, Text, nil},
{`[@;].*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
},
"literal": {
// Binary
{`0b[01]+`, NumberBin, Pop(1)},
// Hex
{`0x\w{1,8}`, NumberHex, Pop(1)},
// Octal
{`0\d+`, NumberOct, Pop(1)},
// Float
{`\d+?\.\d+?`, NumberFloat, Pop(1)},
// Integer
{`\d+`, NumberInteger, Pop(1)},
// String
{`(")(.+)(")`, ByGroups(Punctuation, StringDouble, Punctuation), Pop(1)},
// Char
{`(')(.{1}|\\.{1})(')`, ByGroups(Punctuation, StringChar, Punctuation), Pop(1)},
},
"opcode": {
// Escape at line end
{`\n`, Text, Pop(1)},
// Comment
{`(@|;).*\n`, CommentSingle, Pop(1)},
// Whitespace
{`(\s+|,)`, Text, nil},
// Register by number
{`[rapcfxwbhsdqv]\d{1,2}`, NameClass, nil},
// Address by hex
{`=0x\w+`, ByGroups(Text, NameLabel), nil},
// Pseudo address by label
{`(=)(\w+)`, ByGroups(Text, NameLabel), nil},
// Immediate
{`#`, Text, Push("literal")},
},
"root": {
Include("commentsandwhitespace"),
// Directive with optional param
{`(\.\w+)([ \t]+\w+\s+?)?`, ByGroups(KeywordNamespace, NameLabel), nil},
// Label with data
{`(\w+)(:)(\s+\.\w+\s+)`, ByGroups(NameLabel, Punctuation, KeywordNamespace), Push("literal")},
// Label
{`(\w+)(:)`, ByGroups(NameLabel, Punctuation), nil},
// Syscall Op
{`svc\s+\w+`, NameNamespace, nil},
// Opcode
{`[a-zA-Z]+`, Text, Push("opcode")},
},
}
}

View File

@ -6,18 +6,14 @@ import (
) )
// Awk lexer. // Awk lexer.
var Awk = internal.Register(MustNewLazyLexer( var Awk = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Awk", Name: "Awk",
Aliases: []string{"awk", "gawk", "mawk", "nawk"}, Aliases: []string{"awk", "gawk", "mawk", "nawk"},
Filenames: []string{"*.awk"}, Filenames: []string{"*.awk"},
MimeTypes: []string{"application/x-awk"}, MimeTypes: []string{"application/x-awk"},
}, },
awkRules, Rules{
))
func awkRules() Rules {
return Rules{
"commentsandwhitespace": { "commentsandwhitespace": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
{`#.*$`, CommentSingle, nil}, {`#.*$`, CommentSingle, nil},
@ -34,19 +30,19 @@ func awkRules() Rules {
"root": { "root": {
{`^(?=\s|/)`, Text, Push("slashstartsregex")}, {`^(?=\s|/)`, Text, Push("slashstartsregex")},
Include("commentsandwhitespace"), Include("commentsandwhitespace"),
{`\+\+|--|\|\||&&|in\b|\$|!?~|\|&|(\*\*|[-<>+*%\^/!=|])=?`, Operator, Push("slashstartsregex")}, {`\+\+|--|\|\||&&|in\b|\$|!?~|(\*\*|[-<>+*%\^/!=|])=?`, Operator, Push("slashstartsregex")},
{`[{(\[;,]`, Punctuation, Push("slashstartsregex")}, {`[{(\[;,]`, Punctuation, Push("slashstartsregex")},
{`[})\].]`, Punctuation, nil}, {`[})\].]`, Punctuation, nil},
{`(break|continue|do|while|exit|for|if|else|return|switch|case|default)\b`, Keyword, Push("slashstartsregex")}, {`(break|continue|do|while|exit|for|if|else|return)\b`, Keyword, Push("slashstartsregex")},
{`function\b`, KeywordDeclaration, Push("slashstartsregex")}, {`function\b`, KeywordDeclaration, Push("slashstartsregex")},
{`(atan2|cos|exp|int|log|rand|sin|sqrt|srand|gensub|gsub|index|length|match|split|patsplit|sprintf|sub|substr|tolower|toupper|close|fflush|getline|next(file)|print|printf|strftime|systime|mktime|delete|system|strtonum|and|compl|lshift|or|rshift|asorti?|isarray|bindtextdomain|dcn?gettext|@(include|load|namespace))\b`, KeywordReserved, nil}, {`(atan2|cos|exp|int|log|rand|sin|sqrt|srand|gensub|gsub|index|length|match|split|sprintf|sub|substr|tolower|toupper|close|fflush|getline|next|nextfile|print|printf|strftime|systime|delete|system)\b`, KeywordReserved, nil},
{`(ARGC|ARGIND|ARGV|BEGIN(FILE)?|BINMODE|CONVFMT|ENVIRON|END(FILE)?|ERRNO|FIELDWIDTHS|FILENAME|FNR|FPAT|FS|IGNORECASE|LINT|NF|NR|OFMT|OFS|ORS|PROCINFO|RLENGTH|RS|RSTART|RT|SUBSEP|TEXTDOMAIN)\b`, NameBuiltin, nil}, {`(ARGC|ARGIND|ARGV|BEGIN|CONVFMT|ENVIRON|END|ERRNO|FIELDWIDTHS|FILENAME|FNR|FS|IGNORECASE|NF|NR|OFMT|OFS|ORFS|RLENGTH|RS|RSTART|RT|SUBSEP)\b`, NameBuiltin, nil},
{`[@$a-zA-Z_]\w*`, NameOther, nil}, {`[$a-zA-Z_]\w*`, NameOther, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil}, {`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+`, LiteralNumberHex, nil}, {`0x[0-9a-fA-F]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil}, {`[0-9]+`, LiteralNumberInteger, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil}, {`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil}, {`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Ballerina lexer. // Ballerina lexer.
var Ballerina = internal.Register(MustNewLazyLexer( var Ballerina = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Ballerina", Name: "Ballerina",
Aliases: []string{"ballerina"}, Aliases: []string{"ballerina"},
@ -14,11 +14,7 @@ var Ballerina = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-ballerina"}, MimeTypes: []string{"text/x-ballerina"},
DotAll: true, DotAll: true,
}, },
ballerinaRules, Rules{
))
func ballerinaRules() Rules {
return Rules{
"root": { "root": {
{`[^\S\n]+`, Text, nil}, {`[^\S\n]+`, Text, nil},
{`//.*?\n`, CommentSingle, nil}, {`//.*?\n`, CommentSingle, nil},
@ -46,5 +42,5 @@ func ballerinaRules() Rules {
"import": { "import": {
{`[\w.]+`, NameNamespace, Pop(1)}, {`[\w.]+`, NameNamespace, Pop(1)},
}, },
} },
} ))

View File

@ -7,27 +7,17 @@ import (
"github.com/alecthomas/chroma/lexers/internal" "github.com/alecthomas/chroma/lexers/internal"
) )
// TODO(moorereason): can this be factored away?
var bashAnalyserRe = regexp.MustCompile(`(?m)^#!.*/bin/(?:env |)(?:bash|zsh|sh|ksh)`) var bashAnalyserRe = regexp.MustCompile(`(?m)^#!.*/bin/(?:env |)(?:bash|zsh|sh|ksh)`)
// Bash lexer. // Bash lexer.
var Bash = internal.Register(MustNewLazyLexer( var Bash = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Bash", Name: "Bash",
Aliases: []string{"bash", "sh", "ksh", "zsh", "shell"}, Aliases: []string{"bash", "sh", "ksh", "zsh", "shell"},
Filenames: []string{"*.sh", "*.ksh", "*.bash", "*.ebuild", "*.eclass", ".env", "*.env", "*.exheres-0", "*.exlib", "*.zsh", "*.zshrc", ".bashrc", "bashrc", ".bash_*", "bash_*", "zshrc", ".zshrc", "PKGBUILD"}, Filenames: []string{"*.sh", "*.ksh", "*.bash", "*.ebuild", "*.eclass", "*.exheres-0", "*.exlib", "*.zsh", "*.zshrc", ".bashrc", "bashrc", ".bash_*", "bash_*", "zshrc", ".zshrc", "PKGBUILD"},
MimeTypes: []string{"application/x-sh", "application/x-shellscript"}, MimeTypes: []string{"application/x-sh", "application/x-shellscript"},
}, },
bashRules, Rules{
).SetAnalyser(func(text string) float32 {
if bashAnalyserRe.FindString(text) != "" {
return 1.0
}
return 0.0
}))
func bashRules() Rules {
return Rules{
"root": { "root": {
Include("basic"), Include("basic"),
{"`", LiteralStringBacktick, Push("backticks")}, {"`", LiteralStringBacktick, Push("backticks")},
@ -96,5 +86,10 @@ func bashRules() Rules {
{"`", LiteralStringBacktick, Pop(1)}, {"`", LiteralStringBacktick, Pop(1)},
Include("root"), Include("root"),
}, },
},
).SetAnalyser(func(text string) float32 {
if bashAnalyserRe.FindString(text) != "" {
return 1.0
} }
} return 0.0
}))
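The Bash hunk also moves the SetAnalyser callback, which scores raw text so a lexer can be picked by content rather than filename. A hedged sketch of the consuming side, using the existing Analyse and Fallback helpers from the chroma lexers package (nothing here is introduced by this diff):

package main

import (
	"fmt"

	"github.com/alecthomas/chroma/lexers"
)

func main() {
	// Analyse asks every registered analyser to score the text; the shebang below
	// is expected to match bashAnalyserRe and return the Bash lexer.
	lexer := lexers.Analyse("#!/bin/bash\necho hello\n")
	if lexer == nil {
		lexer = lexers.Fallback
	}
	fmt.Println(lexer.Config().Name)
}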

View File

@ -1,27 +0,0 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// BashSession lexer.
var BashSession = internal.Register(MustNewLazyLexer(
&Config{
Name: "BashSession",
Aliases: []string{"bash-session", "console", "shell-session"},
Filenames: []string{".sh-session"},
MimeTypes: []string{"text/x-sh"},
EnsureNL: true,
},
bashsessionRules,
))
func bashsessionRules() Rules {
return Rules{
"root": {
{`^((?:\[[^]]+@[^]]+\]\s?)?[#$%>])(\s*)(.*\n?)`, ByGroups(GenericPrompt, Text, Using(Bash)), nil},
{`^.+\n?`, GenericOutput, nil},
},
}
}

View File

@ -6,7 +6,7 @@ import (
) )
// Batchfile lexer. // Batchfile lexer.
var Batchfile = internal.Register(MustNewLazyLexer( var Batchfile = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Batchfile", Name: "Batchfile",
Aliases: []string{"bat", "batch", "dosbatch", "winbatch"}, Aliases: []string{"bat", "batch", "dosbatch", "winbatch"},
@ -14,11 +14,7 @@ var Batchfile = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"application/x-dos-batch"}, MimeTypes: []string{"application/x-dos-batch"},
CaseInsensitive: true, CaseInsensitive: true,
}, },
batchfileRules, Rules{
))
func batchfileRules() Rules {
return Rules{
"root": { "root": {
{`\)((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*)`, CommentSingle, nil}, {`\)((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*)`, CommentSingle, nil},
{`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow")}, {`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow")},
@ -194,5 +190,5 @@ func batchfileRules() Rules {
{`else(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])`, Keyword, Pop(1)}, {`else(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])`, Keyword, Pop(1)},
Default(Pop(1)), Default(Pop(1)),
}, },
} },
} ))

View File

@ -6,7 +6,7 @@ import (
) )
// Bibtex lexer. // Bibtex lexer.
var Bibtex = internal.Register(MustNewLazyLexer( var Bibtex = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "BibTeX", Name: "BibTeX",
Aliases: []string{"bib", "bibtex"}, Aliases: []string{"bib", "bibtex"},
@ -15,11 +15,7 @@ var Bibtex = internal.Register(MustNewLazyLexer(
NotMultiline: true, NotMultiline: true,
CaseInsensitive: true, CaseInsensitive: true,
}, },
bibtexRules, Rules{
))
func bibtexRules() Rules {
return Rules{
"root": { "root": {
Include("whitespace"), Include("whitespace"),
{`@comment`, Comment, nil}, {`@comment`, Comment, nil},
@ -76,5 +72,5 @@ func bibtexRules() Rules {
"whitespace": { "whitespace": {
{`\s+`, Text, nil}, {`\s+`, Text, nil},
}, },
} },
} ))

View File

@ -1,112 +0,0 @@
package b
import (
"strings"
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Bicep lexer.
var Bicep = internal.Register(MustNewLazyLexer(
&Config{
Name: "Bicep",
Aliases: []string{"bicep"},
Filenames: []string{"*.bicep"},
},
bicepRules,
))
func bicepRules() Rules {
bicepFunctions := []string{
"any",
"array",
"concat",
"contains",
"empty",
"first",
"intersection",
"items",
"last",
"length",
"min",
"max",
"range",
"skip",
"take",
"union",
"dateTimeAdd",
"utcNow",
"deployment",
"environment",
"loadFileAsBase64",
"loadTextContent",
"int",
"json",
"extensionResourceId",
"getSecret",
"list",
"listKeys",
"listKeyValue",
"listAccountSas",
"listSecrets",
"pickZones",
"reference",
"resourceId",
"subscriptionResourceId",
"tenantResourceId",
"managementGroup",
"resourceGroup",
"subscription",
"tenant",
"base64",
"base64ToJson",
"base64ToString",
"dataUri",
"dataUriToString",
"endsWith",
"format",
"guid",
"indexOf",
"lastIndexOf",
"length",
"newGuid",
"padLeft",
"replace",
"split",
"startsWith",
"string",
"substring",
"toLower",
"toUpper",
"trim",
"uniqueString",
"uri",
"uriComponent",
"uriComponentToString",
}
return Rules{
"root": {
{`//[^\n\r]+`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`([']?\w+[']?)(:)`, ByGroups(NameProperty, Punctuation), nil},
{`\b('(resourceGroup|subscription|managementGroup|tenant)')\b`, KeywordNamespace, nil},
{`'[\w\$\{\(\)\}\.]{1,}?'`, LiteralStringInterpol, nil},
{`('''|').*?('''|')`, LiteralString, nil},
{`\b(allowed|batchSize|description|maxLength|maxValue|metadata|minLength|minValue|secure)\b`, NameDecorator, nil},
{`\b(az|sys)\.`, NameNamespace, nil},
{`\b(` + strings.Join(bicepFunctions, "|") + `)\b`, NameFunction, nil},
// https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-logical
{`\b(bool)(\()`, ByGroups(NameFunction, Punctuation), nil},
{`\b(for|if|in)\b`, Keyword, nil},
{`\b(module|output|param|resource|var)\b`, KeywordDeclaration, nil},
{`\b(array|bool|int|object|string)\b`, KeywordType, nil},
// https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/operators
{`(>=|>|<=|<|==|!=|=~|!~|::|&&|\?\?|!|-|%|\*|\/|\+)`, Operator, nil},
{`[\(\)\[\]\.:\?{}@=]`, Punctuation, nil},
{`[\w_-]+`, Text, nil},
{`\s+`, TextWhitespace, nil},
},
}
}

View File

@ -6,7 +6,7 @@ import (
) )
// Blitzbasic lexer. // Blitzbasic lexer.
var Blitzbasic = internal.Register(MustNewLazyLexer( var Blitzbasic = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "BlitzBasic", Name: "BlitzBasic",
Aliases: []string{"blitzbasic", "b3d", "bplus"}, Aliases: []string{"blitzbasic", "b3d", "bplus"},
@ -14,11 +14,7 @@ var Blitzbasic = internal.Register(MustNewLazyLexer(
MimeTypes: []string{"text/x-bb"}, MimeTypes: []string{"text/x-bb"},
CaseInsensitive: true, CaseInsensitive: true,
}, },
blitzbasicRules, Rules{
))
func blitzbasicRules() Rules {
return Rules{
"root": { "root": {
{`[ \t]+`, Text, nil}, {`[ \t]+`, Text, nil},
{`;.*?\n`, CommentSingle, nil}, {`;.*?\n`, CommentSingle, nil},
@ -48,5 +44,5 @@ func blitzbasicRules() Rules {
{`"C?`, LiteralStringDouble, Pop(1)}, {`"C?`, LiteralStringDouble, Pop(1)},
{`[^"]+`, LiteralStringDouble, nil}, {`[^"]+`, LiteralStringDouble, nil},
}, },
} },
} ))

View File

@ -6,23 +6,19 @@ import (
) )
// Bnf lexer. // Bnf lexer.
var Bnf = internal.Register(MustNewLazyLexer( var Bnf = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "BNF", Name: "BNF",
Aliases: []string{"bnf"}, Aliases: []string{"bnf"},
Filenames: []string{"*.bnf"}, Filenames: []string{"*.bnf"},
MimeTypes: []string{"text/x-bnf"}, MimeTypes: []string{"text/x-bnf"},
}, },
bnfRules, Rules{
))
func bnfRules() Rules {
return Rules{
"root": { "root": {
{`(<)([ -;=?-~]+)(>)`, ByGroups(Punctuation, NameClass, Punctuation), nil}, {`(<)([ -;=?-~]+)(>)`, ByGroups(Punctuation, NameClass, Punctuation), nil},
{`::=`, Operator, nil}, {`::=`, Operator, nil},
{`[^<>:]+`, Text, nil}, {`[^<>:]+`, Text, nil},
{`.`, Text, nil}, {`.`, Text, nil},
}, },
} },
} ))

View File

@ -6,18 +6,14 @@ import (
) )
// Brainfuck lexer. // Brainfuck lexer.
var Brainfuck = internal.Register(MustNewLazyLexer( var Brainfuck = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Brainfuck", Name: "Brainfuck",
Aliases: []string{"brainfuck", "bf"}, Aliases: []string{"brainfuck", "bf"},
Filenames: []string{"*.bf", "*.b"}, Filenames: []string{"*.bf", "*.b"},
MimeTypes: []string{"application/x-brainfuck"}, MimeTypes: []string{"application/x-brainfuck"},
}, },
brainfuckRules, Rules{
))
func brainfuckRules() Rules {
return Rules{
"common": { "common": {
{`[.,]+`, NameTag, nil}, {`[.,]+`, NameTag, nil},
{`[+-]+`, NameBuiltin, nil}, {`[+-]+`, NameBuiltin, nil},
@ -34,5 +30,5 @@ func brainfuckRules() Rules {
{`\]`, Keyword, Pop(1)}, {`\]`, Keyword, Pop(1)},
Include("common"), Include("common"),
}, },
} },
} ))

View File

@ -6,19 +6,14 @@ import (
) )
// C lexer. // C lexer.
var C = internal.Register(MustNewLazyLexer( var C = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "C", Name: "C",
Aliases: []string{"c"}, Aliases: []string{"c"},
Filenames: []string{"*.c", "*.h", "*.idc", "*.x[bp]m"}, Filenames: []string{"*.c", "*.h", "*.idc"},
MimeTypes: []string{"text/x-chdr", "text/x-csrc", "image/x-xbitmap", "image/x-xpixmap"}, MimeTypes: []string{"text/x-chdr", "text/x-csrc"},
EnsureNL: true,
}, },
cRules, Rules{
))
func cRules() Rules {
return Rules{
"whitespace": { "whitespace": {
{`^#if\s+0`, CommentPreproc, Push("if0")}, {`^#if\s+0`, CommentPreproc, Push("if0")},
{`^#`, CommentPreproc, Push("macro")}, {`^#`, CommentPreproc, Push("macro")},
@ -43,7 +38,7 @@ func cRules() Rules {
{`[~!%^&*+=|?:<>/-]`, Operator, nil}, {`[~!%^&*+=|?:<>/-]`, Operator, nil},
{`[()\[\],.]`, Punctuation, nil}, {`[()\[\],.]`, Punctuation, nil},
{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil}, {Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
{`(bool|int|long|float|short|double|char((8|16|32)_t)?|unsigned|signed|void|u?int(_fast|_least|)(8|16|32|64)_t)\b`, KeywordType, nil}, {`(bool|int|long|float|short|double|char|unsigned|signed|void)\b`, KeywordType, nil},
{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil}, {Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil}, {`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil}, {Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
@ -92,5 +87,5 @@ func cRules() Rules {
{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)}, {`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
{`.*?\n`, Comment, nil}, {`.*?\n`, Comment, nil},
}, },
} },
} ))

View File

@ -6,149 +6,143 @@ import (
) )
// caddyfileCommon are the rules common to both of the lexer variants // caddyfileCommon are the rules common to both of the lexer variants
func caddyfileCommonRules() Rules { var caddyfileCommon = Rules{
return Rules{ "site_block_common": {
"site_block_common": { // Import keyword
// Import keyword {`(import)(\s+)([^\s]+)`, ByGroups(Keyword, Text, NameVariableMagic), nil},
{`(import)(\s+)([^\s]+)`, ByGroups(Keyword, Text, NameVariableMagic), nil}, // Matcher definition
// Matcher definition {`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")},
{`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")}, // Matcher token stub for docs
// Matcher token stub for docs {`\[\<matcher\>\]`, NameDecorator, Push("matcher")},
{`\[\<matcher\>\]`, NameDecorator, Push("matcher")}, // These cannot have matchers but may have things that look like
// These cannot have matchers but may have things that look like // matchers in their arguments, so we just parse as a subdirective.
// matchers in their arguments, so we just parse as a subdirective. {`try_files`, Keyword, Push("subdirective")},
{`try_files`, Keyword, Push("subdirective")}, // These are special, they can nest more directives
// These are special, they can nest more directives {`handle_errors|handle|route|handle_path|not`, Keyword, Push("nested_directive")},
{`handle_errors|handle|route|handle_path|not`, Keyword, Push("nested_directive")}, // Any other directive
// Any other directive {`[^\s#]+`, Keyword, Push("directive")},
{`[^\s#]+`, Keyword, Push("directive")}, Include("base"),
Include("base"), },
}, "matcher": {
"matcher": { {`\{`, Punctuation, Push("block")},
{`\{`, Punctuation, Push("block")}, // Not can be one-liner
// Not can be one-liner {`not`, Keyword, Push("deep_not_matcher")},
{`not`, Keyword, Push("deep_not_matcher")}, // Any other same-line matcher
// Any other same-line matcher {`[^\s#]+`, Keyword, Push("arguments")},
{`[^\s#]+`, Keyword, Push("arguments")}, // Terminators
// Terminators {`\n`, Text, Pop(1)},
{`\n`, Text, Pop(1)}, {`\}`, Punctuation, Pop(1)},
{`\}`, Punctuation, Pop(1)}, Include("base"),
Include("base"), },
}, "block": {
"block": { {`\}`, Punctuation, Pop(2)},
{`\}`, Punctuation, Pop(2)}, // Not can be one-liner
// Not can be one-liner {`not`, Keyword, Push("not_matcher")},
{`not`, Keyword, Push("not_matcher")}, // Any other subdirective
// Any other subdirective {`[^\s#]+`, Keyword, Push("subdirective")},
{`[^\s#]+`, Keyword, Push("subdirective")}, Include("base"),
Include("base"), },
}, "nested_block": {
"nested_block": { {`\}`, Punctuation, Pop(2)},
{`\}`, Punctuation, Pop(2)}, // Matcher definition
// Matcher definition {`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")},
{`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")}, // Something that starts with literally < is probably a docs stub
// Something that starts with literally < is probably a docs stub {`\<[^#]+\>`, Keyword, Push("nested_directive")},
{`\<[^#]+\>`, Keyword, Push("nested_directive")}, // Any other directive
// Any other directive {`[^\s#]+`, Keyword, Push("nested_directive")},
{`[^\s#]+`, Keyword, Push("nested_directive")}, Include("base"),
Include("base"), },
}, "not_matcher": {
"not_matcher": { {`\}`, Punctuation, Pop(2)},
{`\}`, Punctuation, Pop(2)}, {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, {`[^\s#]+`, Keyword, Push("arguments")},
{`[^\s#]+`, Keyword, Push("arguments")}, {`\s+`, Text, nil},
{`\s+`, Text, nil}, },
}, "deep_not_matcher": {
"deep_not_matcher": { {`\}`, Punctuation, Pop(2)},
{`\}`, Punctuation, Pop(2)}, {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, {`[^\s#]+`, Keyword, Push("deep_subdirective")},
{`[^\s#]+`, Keyword, Push("deep_subdirective")}, {`\s+`, Text, nil},
{`\s+`, Text, nil}, },
}, "directive": {
"directive": { {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, Include("matcher_token"),
Include("matcher_token"), Include("comments_pop_1"),
Include("comments_pop_1"), {`\n`, Text, Pop(1)},
{`\n`, Text, Pop(1)}, Include("base"),
Include("base"), },
}, "nested_directive": {
"nested_directive": { {`\{(?=\s)`, Punctuation, Push("nested_block")},
{`\{(?=\s)`, Punctuation, Push("nested_block")}, Include("matcher_token"),
Include("matcher_token"), Include("comments_pop_1"),
Include("comments_pop_1"), {`\n`, Text, Pop(1)},
{`\n`, Text, Pop(1)}, Include("base"),
Include("base"), },
}, "subdirective": {
"subdirective": { {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, Include("comments_pop_1"),
Include("comments_pop_1"), {`\n`, Text, Pop(1)},
{`\n`, Text, Pop(1)}, Include("base"),
Include("base"), },
}, "arguments": {
"arguments": { {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, Include("comments_pop_2"),
Include("comments_pop_2"), {`\\\n`, Text, nil}, // Skip escaped newlines
{`\\\n`, Text, nil}, // Skip escaped newlines {`\n`, Text, Pop(2)},
{`\n`, Text, Pop(2)}, Include("base"),
Include("base"), },
}, "deep_subdirective": {
"deep_subdirective": { {`\{(?=\s)`, Punctuation, Push("block")},
{`\{(?=\s)`, Punctuation, Push("block")}, Include("comments_pop_3"),
Include("comments_pop_3"), {`\n`, Text, Pop(3)},
{`\n`, Text, Pop(3)}, Include("base"),
Include("base"), },
}, "matcher_token": {
"matcher_token": { {`@[^\s]+`, NameDecorator, Push("arguments")}, // Named matcher
{`@[^\s]+`, NameDecorator, Push("arguments")}, // Named matcher {`/[^\s]+`, NameDecorator, Push("arguments")}, // Path matcher
{`/[^\s]+`, NameDecorator, Push("arguments")}, // Path matcher {`\*`, NameDecorator, Push("arguments")}, // Wildcard path matcher
{`\*`, NameDecorator, Push("arguments")}, // Wildcard path matcher {`\[\<matcher\>\]`, NameDecorator, Push("arguments")}, // Matcher token stub for docs
{`\[\<matcher\>\]`, NameDecorator, Push("arguments")}, // Matcher token stub for docs },
}, "comments": {
"comments": { {`^#.*\n`, CommentSingle, nil}, // Comment at start of line
{`^#.*\n`, CommentSingle, nil}, // Comment at start of line {`\s+#.*\n`, CommentSingle, nil}, // Comment preceded by whitespace
{`\s+#.*\n`, CommentSingle, nil}, // Comment preceded by whitespace },
}, "comments_pop_1": {
"comments_pop_1": { {`^#.*\n`, CommentSingle, Pop(1)}, // Comment at start of line
{`^#.*\n`, CommentSingle, Pop(1)}, // Comment at start of line {`\s+#.*\n`, CommentSingle, Pop(1)}, // Comment preceded by whitespace
{`\s+#.*\n`, CommentSingle, Pop(1)}, // Comment preceded by whitespace },
}, "comments_pop_2": {
"comments_pop_2": { {`^#.*\n`, CommentSingle, Pop(2)}, // Comment at start of line
{`^#.*\n`, CommentSingle, Pop(2)}, // Comment at start of line {`\s+#.*\n`, CommentSingle, Pop(2)}, // Comment preceded by whitespace
{`\s+#.*\n`, CommentSingle, Pop(2)}, // Comment preceded by whitespace },
}, "comments_pop_3": {
"comments_pop_3": { {`^#.*\n`, CommentSingle, Pop(3)}, // Comment at start of line
{`^#.*\n`, CommentSingle, Pop(3)}, // Comment at start of line {`\s+#.*\n`, CommentSingle, Pop(3)}, // Comment preceded by whitespace
{`\s+#.*\n`, CommentSingle, Pop(3)}, // Comment preceded by whitespace },
}, "base": {
"base": { Include("comments"),
Include("comments"), {`(on|off|first|last|before|after|internal|strip_prefix|strip_suffix|replace)\b`, NameConstant, nil},
{`(on|off|first|last|before|after|internal|strip_prefix|strip_suffix|replace)\b`, NameConstant, nil}, {`(https?://)?([a-z0-9.-]+)(:)([0-9]+)`, ByGroups(Name, Name, Punctuation, LiteralNumberInteger), nil},
{`(https?://)?([a-z0-9.-]+)(:)([0-9]+)`, ByGroups(Name, Name, Punctuation, LiteralNumberInteger), nil}, {`[a-z-]+/[a-z-+]+`, LiteralString, nil},
{`[a-z-]+/[a-z-+]+`, LiteralString, nil}, {`[0-9]+[km]?\b`, LiteralNumberInteger, nil},
{`[0-9]+[km]?\b`, LiteralNumberInteger, nil}, {`\{[\w+.\$-]+\}`, LiteralStringEscape, nil}, // Placeholder
{`\{[\w+.\$-]+\}`, LiteralStringEscape, nil}, // Placeholder {`\[(?=[^#{}$]+\])`, Punctuation, nil},
{`\[(?=[^#{}$]+\])`, Punctuation, nil}, {`\]|\|`, Punctuation, nil},
{`\]|\|`, Punctuation, nil}, {`[^\s#{}$\]]+`, LiteralString, nil},
{`[^\s#{}$\]]+`, LiteralString, nil}, {`/[^\s#]*`, Name, nil},
{`/[^\s#]*`, Name, nil}, {`\s+`, Text, nil},
{`\s+`, Text, nil}, },
},
}
} }
// Caddyfile lexer. // Caddyfile lexer.
var Caddyfile = internal.Register(MustNewLazyLexer( var Caddyfile = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Caddyfile", Name: "Caddyfile",
Aliases: []string{"caddyfile", "caddy"}, Aliases: []string{"caddyfile", "caddy"},
Filenames: []string{"Caddyfile*"}, Filenames: []string{"Caddyfile*"},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
caddyfileRules, Rules{
))
func caddyfileRules() Rules {
return Rules{
"root": { "root": {
Include("comments"), Include("comments"),
// Global options block // Global options block
@ -192,25 +186,21 @@ func caddyfileRules() Rules {
{`\}`, Punctuation, Pop(2)}, {`\}`, Punctuation, Pop(2)},
Include("site_block_common"), Include("site_block_common"),
}, },
}.Merge(caddyfileCommonRules()) }.Merge(caddyfileCommon),
} ))
// Caddyfile directive-only lexer. // Caddyfile directive-only lexer.
var CaddyfileDirectives = internal.Register(MustNewLazyLexer( var CaddyfileDirectives = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Caddyfile Directives", Name: "Caddyfile Directives",
Aliases: []string{"caddyfile-directives", "caddyfile-d", "caddy-d"}, Aliases: []string{"caddyfile-directives", "caddyfile-d", "caddy-d"},
Filenames: []string{}, Filenames: []string{},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
caddyfileDirectivesRules, Rules{
))
func caddyfileDirectivesRules() Rules {
return Rules{
// Same as "site_block" in Caddyfile // Same as "site_block" in Caddyfile
"root": { "root": {
Include("site_block_common"), Include("site_block_common"),
}, },
}.Merge(caddyfileCommonRules()) }.Merge(caddyfileCommon),
} ))
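Both Caddyfile variants above assemble their own "root" state and then fold in the shared states with Rules.Merge. A hypothetical pair of rule constructors sketching that sharing (commonRules and fullRules are illustrative names, not part of this diff):

package c

import (
	. "github.com/alecthomas/chroma" // nolint
)

// commonRules holds states shared by two related lexers.
func commonRules() Rules {
	return Rules{
		"comments": {
			{`#.*\n`, CommentSingle, nil},
		},
	}
}

// fullRules layers its own "root" state on top of the shared ones via Merge.
func fullRules() Rules {
	return Rules{
		"root": {
			Include("comments"),
			{`[^\s#]+`, Keyword, nil},
			{`\s+`, Text, nil},
		},
	}.Merge(commonRules())
}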

View File

@ -6,18 +6,14 @@ import (
) )
// Cap'N'Proto Proto lexer. // Cap'N'Proto Proto lexer.
var CapNProto = internal.Register(MustNewLazyLexer( var CapNProto = internal.Register(MustNewLexer(
&Config{ &Config{
Name: "Cap'n Proto", Name: "Cap'n Proto",
Aliases: []string{"capnp"}, Aliases: []string{"capnp"},
Filenames: []string{"*.capnp"}, Filenames: []string{"*.capnp"},
MimeTypes: []string{}, MimeTypes: []string{},
}, },
capNProtoRules, Rules{
))
func capNProtoRules() Rules {
return Rules{
"root": { "root": {
{`#.*?$`, CommentSingle, nil}, {`#.*?$`, CommentSingle, nil},
{`@[0-9a-zA-Z]*`, NameDecorator, nil}, {`@[0-9a-zA-Z]*`, NameDecorator, nil},
@ -61,5 +57,5 @@ func capNProtoRules() Rules {
{`[])]`, NameAttribute, Pop(1)}, {`[])]`, NameAttribute, Pop(1)},
Default(Pop(1)), Default(Pop(1)),
}, },
} },
} ))

Some files were not shown because too many files have changed in this diff