Mirror of https://github.com/cheat/cheat.git (synced 2025-09-01 09:38:29 +02:00)
Compare commits
190 Commits
.github/ISSUE_TEMPLATE/bug_report.md (new file, 23 lines, vendored)
@@ -0,0 +1,23 @@
---
name: Bug report
about: Submit a bug report
title: ''
labels: 'bug'
assignees: ''

---

Thanks for submitting a bug report. Please provide the following information:

**A description of the problem**
Describe the problem here.

**cheat version info**
Please paste the output of `cheat -v` here.

**cheat configuration info**
If your bug pertains to how cheatsheets are loaded and/or displayed, please
paste here the following information:

1. The output of `cheat -d`
2. The contents of your `conf.yml` file
.github/ISSUE_TEMPLATE/feature_request.md (new file, 20 lines, vendored)
@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: 'enhancement'
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
.github/dependabot.yml (new file, 11 lines, vendored)
@@ -0,0 +1,11 @@
version: 2
updates:
- package-ecosystem: gomod
  directory: "/"
  schedule:
    interval: daily
  open-pull-requests-limit: 10
  ignore:
  - dependency-name: github.com/alecthomas/chroma
    versions:
    - 0.9.1
.github/workflows/build.yml (new file, 57 lines, vendored)
@ -0,0 +1,57 @@
|
||||
name: Go
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ master ]
|
||||
pull_request:
|
||||
branches: [ master ]
|
||||
|
||||
jobs:
|
||||
# TODO: is it possible to DRY out these jobs? Aside from `runs-on`, they are
|
||||
# identical.
|
||||
build-linux:
|
||||
runs-on: [ ubuntu-latest ]
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
||||
- name: Set up Go
|
||||
uses: actions/setup-go@v2
|
||||
with:
|
||||
go-version: 1.16
|
||||
|
||||
- name: Set up Revive (linter)
|
||||
run: go get -u github.com/boyter/scc github.com/mgechev/revive
|
||||
env:
|
||||
GO111MODULE: off
|
||||
|
||||
|
||||
- name: Build
|
||||
run: make build
|
||||
|
||||
- name: Test
|
||||
run: make test
|
||||
|
||||
build-osx:
|
||||
runs-on: [ macos-latest ]
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
||||
- name: Set up Go
|
||||
uses: actions/setup-go@v2
|
||||
with:
|
||||
go-version: 1.16
|
||||
|
||||
- name: Set up Revive (linter)
|
||||
run: go get -u github.com/boyter/scc github.com/mgechev/revive
|
||||
env:
|
||||
GO111MODULE: off
|
||||
|
||||
- name: Build
|
||||
run: make build
|
||||
|
||||
- name: Test
|
||||
run: make test
|
||||
|
||||
# TODO: windows
|
.github/workflows/codeql-analysis.yml (new file, 36 lines, vendored)
@ -0,0 +1,36 @@
|
||||
name: CodeQL
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [ master ]
|
||||
|
||||
pull_request:
|
||||
branches: [ master ]
|
||||
|
||||
schedule:
|
||||
- cron: '45 23 * * 0'
|
||||
|
||||
jobs:
|
||||
analyze:
|
||||
name: Analyze
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
language: [ 'go' ]
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v2
|
||||
|
||||
- name: Initialize CodeQL
|
||||
uses: github/codeql-action/init@v1
|
||||
with:
|
||||
languages: ${{ matrix.language }}
|
||||
|
||||
- name: Autobuild
|
||||
uses: github/codeql-action/autobuild@v1
|
||||
|
||||
- name: Perform CodeQL Analysis
|
||||
uses: github/codeql-action/analyze@v1
|
.github/workflows/homebrew.yml (new file, 17 lines, vendored)
@@ -0,0 +1,17 @@
name: homebrew

on:
  push:
    tags: '*'

jobs:
  homebrew:
    name: Bump Homebrew formula
    runs-on: ubuntu-latest
    steps:
      - uses: mislav/bump-homebrew-formula-action@v1
        with:
          # A PR will be sent to github.com/Homebrew/homebrew-core to update this formula:
          formula-name: cheat
        env:
          COMMITTER_TOKEN: ${{ secrets.COMMITTER_TOKEN }}
.gitignore (vendored, 1 line changed)
@@ -1 +1,2 @@
dist
tags
Dockerfile (new file, 8 lines)
@@ -0,0 +1,8 @@
# NB: this image isn't used anywhere in the build pipeline. It exists to
# conveniently facilitate ad-hoc experimentation in a sandboxed environment
# during development.
FROM golang:1.15-alpine

RUN apk add git less make

WORKDIR /app
Makefile (new file, 195 lines)
@ -0,0 +1,195 @@
|
||||
# paths
|
||||
makefile := $(realpath $(lastword $(MAKEFILE_LIST)))
|
||||
cmd_dir := ./cmd/cheat
|
||||
dist_dir := ./dist
|
||||
|
||||
# executables
|
||||
CAT := cat
|
||||
COLUMN := column
|
||||
CTAGS := ctags
|
||||
DOCKER := docker
|
||||
GO := go
|
||||
GREP := grep
|
||||
GZIP := gzip --best
|
||||
LINT := revive
|
||||
MAN := man
|
||||
MKDIR := mkdir -p
|
||||
PANDOC := pandoc
|
||||
RM := rm
|
||||
SCC := scc
|
||||
SED := sed
|
||||
SORT := sort
|
||||
ZIP := zip -m
|
||||
|
||||
docker_image := cheat-devel:latest
|
||||
|
||||
# build flags
|
||||
BUILD_FLAGS := -ldflags="-s -w" -mod vendor -trimpath
|
||||
GOBIN :=
|
||||
TMPDIR := /tmp
|
||||
|
||||
# release binaries
|
||||
releases := \
|
||||
$(dist_dir)/cheat-darwin-amd64 \
|
||||
$(dist_dir)/cheat-linux-386 \
|
||||
$(dist_dir)/cheat-linux-amd64 \
|
||||
$(dist_dir)/cheat-linux-arm5 \
|
||||
$(dist_dir)/cheat-linux-arm6 \
|
||||
$(dist_dir)/cheat-linux-arm7 \
|
||||
$(dist_dir)/cheat-linux-arm64 \
|
||||
$(dist_dir)/cheat-windows-amd64.exe
|
||||
|
||||
## build: build an executable for your architecture
|
||||
.PHONY: build
|
||||
build: $(dist_dir) clean fmt lint vet vendor generate man
|
||||
$(GO) build $(BUILD_FLAGS) -o $(dist_dir)/cheat $(cmd_dir)
|
||||
|
||||
## build-release: build release executables
|
||||
.PHONY: build-release
|
||||
build-release: $(releases)
|
||||
|
||||
# cheat-darwin-amd64
|
||||
$(dist_dir)/cheat-darwin-amd64: prepare
|
||||
GOARCH=amd64 GOOS=darwin \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-386
|
||||
$(dist_dir)/cheat-linux-386: prepare
|
||||
GOARCH=386 GOOS=linux \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-amd64
|
||||
$(dist_dir)/cheat-linux-amd64: prepare
|
||||
GOARCH=amd64 GOOS=linux \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-arm5
|
||||
$(dist_dir)/cheat-linux-arm5: prepare
|
||||
GOARCH=arm GOOS=linux GOARM=5 \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-arm6
|
||||
$(dist_dir)/cheat-linux-arm6: prepare
|
||||
GOARCH=arm GOOS=linux GOARM=6 \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-arm7
|
||||
$(dist_dir)/cheat-linux-arm7: prepare
|
||||
GOARCH=arm GOOS=linux GOARM=7 \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-linux-arm64
|
||||
$(dist_dir)/cheat-linux-arm64: prepare
|
||||
GOARCH=arm64 GOOS=linux \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz
|
||||
|
||||
# cheat-windows-amd64
|
||||
$(dist_dir)/cheat-windows-amd64.exe: prepare
|
||||
GOARCH=amd64 GOOS=windows \
|
||||
$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(ZIP) $@.zip $@
|
||||
|
||||
# ./dist
|
||||
$(dist_dir):
|
||||
$(MKDIR) $(dist_dir)
|
||||
|
||||
.PHONY: generate
|
||||
generate:
|
||||
$(GO) generate $(cmd_dir)
|
||||
|
||||
## install: build and install cheat on your PATH
|
||||
.PHONY: install
|
||||
install: build
|
||||
$(GO) install $(BUILD_FLAGS) $(GOBIN) $(cmd_dir)
|
||||
|
||||
## clean: remove compiled executables
|
||||
.PHONY: clean
|
||||
clean: $(dist_dir)
|
||||
$(RM) -f $(dist_dir)/*
|
||||
|
||||
## distclean: remove the tags file
|
||||
.PHONY: distclean
|
||||
distclean:
|
||||
$(RM) -f tags
|
||||
@$(DOCKER) image rm -f $(docker_image)
|
||||
|
||||
## setup: install revive (linter) and scc (sloc tool)
|
||||
.PHONY: setup
|
||||
setup:
|
||||
GO111MODULE=off $(GO) get -u github.com/boyter/scc github.com/mgechev/revive
|
||||
|
||||
## sloc: count "semantic lines of code"
|
||||
.PHONY: sloc
|
||||
sloc:
|
||||
$(SCC) --exclude-dir=vendor
|
||||
|
||||
## tags: build a tags file
|
||||
.PHONY: tags
|
||||
tags:
|
||||
$(CTAGS) -R --exclude=vendor --languages=go
|
||||
|
||||
## man: build a man page
|
||||
# NB: pandoc may not be installed, so we're ignoring this error on failure
|
||||
.PHONY: man
|
||||
man:
|
||||
-$(PANDOC) -s -t man doc/cheat.1.md -o doc/cheat.1
|
||||
|
||||
## vendor: download, tidy, and verify dependencies
|
||||
.PHONY: vendor
|
||||
vendor:
|
||||
$(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify
|
||||
|
||||
## vendor-update: update vendored dependencies
|
||||
vendor-update:
|
||||
$(GO) get -t -u ./... && $(GO) mod vendor
|
||||
|
||||
## fmt: run go fmt
|
||||
.PHONY: fmt
|
||||
fmt:
|
||||
$(GO) fmt ./...
|
||||
|
||||
## lint: lint go source files
|
||||
.PHONY: lint
|
||||
lint: vendor
|
||||
$(LINT) -exclude vendor/... ./...
|
||||
|
||||
## vet: vet go source files
|
||||
.PHONY: vet
|
||||
vet:
|
||||
$(GO) vet ./...
|
||||
|
||||
## test: run unit-tests
|
||||
.PHONY: test
|
||||
test:
|
||||
$(GO) test ./...
|
||||
|
||||
## coverage: generate a test coverage report
|
||||
.PHONY: coverage
|
||||
coverage:
|
||||
$(GO) test ./... -coverprofile=$(TMPDIR)/cheat-coverage.out && \
|
||||
$(GO) tool cover -html=$(TMPDIR)/cheat-coverage.out
|
||||
|
||||
## check: format, lint, vet, vendor, and run unit-tests
|
||||
.PHONY: check
|
||||
check: | vendor fmt lint vet test
|
||||
|
||||
.PHONY: prepare
|
||||
prepare: | $(dist_dir) clean generate vendor fmt lint vet test
|
||||
|
||||
## docker-setup: create a docker image for use during development
|
||||
.PHONY: docker-setup
|
||||
docker-setup:
|
||||
$(DOCKER) build -t $(docker_image) -f Dockerfile .
|
||||
|
||||
## docker-sh: shell into the docker development container
|
||||
.PHONY: docker-sh
|
||||
docker-sh:
|
||||
$(DOCKER) run -v $(shell pwd):/app -ti $(docker_image) /bin/ash
|
||||
|
||||
## help: display this help text
|
||||
.PHONY: help
|
||||
help:
|
||||
@$(CAT) $(makefile) | \
|
||||
$(SORT) | \
|
||||
$(GREP) "^##" | \
|
||||
$(SED) 's/## //g' | \
|
||||
$(COLUMN) -t -s ':'
|
README.md (61 lines changed)
@ -1,5 +1,9 @@
|
||||

|
||||
|
||||
|
||||
cheat
|
||||
=====
|
||||
|
||||
`cheat` allows you to create and view interactive cheatsheets on the
|
||||
command-line. It was designed to help remind \*nix system administrators of
|
||||
options for commands that they use frequently, but not frequently enough to
|
||||
@ -22,7 +26,7 @@ cheat tar
|
||||
You will be presented with a cheatsheet resembling the following:
|
||||
|
||||
```sh
|
||||
# To extract an uncompressed archive:
|
||||
# To extract an uncompressed archive:
|
||||
tar -xvf '/path/to/foo.tar'
|
||||
|
||||
# To extract a .gz archive:
|
||||
@ -44,15 +48,17 @@ Installing
|
||||
`cheat` has no dependencies. To install it, download the executable from the
|
||||
[releases][] page and place it on your `PATH`.
|
||||
|
||||
Alternatively, if you have [go][] installed, you may install `cheat` using `go
|
||||
get`:
|
||||
|
||||
```sh
|
||||
go get -u github.com/cheat/cheat/cmd/cheat
|
||||
```
|
||||
|
||||
Configuring
|
||||
-----------
|
||||
### conf.yml ###
|
||||
`cheat` is configured by a YAML file that can be generated with `cheat --init`:
|
||||
|
||||
```sh
|
||||
mkdir -p ~/.config/cheat && cheat --init > ~/.config/cheat/conf.yml
|
||||
```
|
||||
`cheat` is configured by a YAML file that will be auto-generated on first run.
|
||||
|
||||
By default, the config file is assumed to exist on an XDG-compliant
|
||||
configuration path like `~/.config/cheat/conf.yml`. If you would like to store
|
||||
@ -86,13 +92,29 @@ const squares = [1, 2, 3, 4].map(x => x * x);
|
||||
```
|
||||
|
||||
The `cheat` executable includes no cheatsheets, but [community-sourced
|
||||
cheatsheets are available][cheatsheets].
|
||||
cheatsheets are available][cheatsheets]. You will be asked if you would like to
|
||||
install the community-sourced cheatsheets the first time you run `cheat`.
|
||||
|
||||
### Script ###
|
||||
You can manage the cheatsheets via the `cheatsheets` script.
|
||||
|
||||
#### Download and install ####
|
||||
```sh
|
||||
mkdir -p ~/.local/bin
|
||||
wget -O ~/.local/bin/cheatsheets https://raw.githubusercontent.com/cheat/cheat/master/scripts/git/cheatsheets
|
||||
chmod +x ~/.local/bin/cheatsheets
|
||||
```
|
||||
|
||||
#### Pull changes ####
|
||||
To pull the community and personal cheatsheets, run `cheatsheets pull`.
|
||||
|
||||
#### Push changes ####
|
||||
To push your personal cheatsheets, run `cheatsheets push`.
|
||||
|
||||
Cheatpaths
|
||||
----------
|
||||
Cheatsheets are stored on "cheatpaths", which are directories that contain
|
||||
cheetsheets. Cheatpaths are specified in the `conf.yml` file.
|
||||
cheatsheets. Cheatpaths are specified in the `conf.yml` file.
|
||||
|
||||
It can be useful to configure `cheat` against multiple cheatpaths. A common
|
||||
pattern is to store cheatsheets from multiple repositories on individual
|
||||
@ -122,6 +144,11 @@ If a user attempts to edit a cheatsheet on a read-only cheatpath, `cheat` will
|
||||
transparently copy that sheet to a writeable directory before opening it for
|
||||
editing.
|
||||
|
||||
### Directory-scoped Cheatpaths ###
|
||||
At times, it can be useful to closely associate cheatsheets with a directory on
|
||||
your filesystem. `cheat` facilitates this by searching for a `.cheat` folder in
|
||||
the current working directory. If found, the `.cheat` directory will
|
||||
(temporarily) be added to the cheatpaths.
|
||||
|
||||
Usage
|
||||
-----
|
||||
@ -175,21 +202,29 @@ To search (by regex) for cheatsheets that contain an IP address:
|
||||
cheat -r -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
|
||||
```
|
||||
|
||||
Flags may be combined in inuitive ways. Example: to search sheets on the
|
||||
Flags may be combined in intuitive ways. Example: to search sheets on the
|
||||
"personal" cheatpath that are tagged with "networking" and match a regex:
|
||||
|
||||
```sh
|
||||
cheat -p personal -t networking -s --regex '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
|
||||
cheat -p personal -t networking --regex -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
|
||||
```
|
||||
|
||||
|
||||
Advanced Usage
|
||||
--------------
|
||||
`cheat` may be integrated with [fzf][]. See [fzf.bash][bash] for instructions.
|
||||
(Support for other shells will be added in future releases.)
|
||||
Shell autocompletion is currently available for `bash`, `fish`, and `zsh`. Copy
|
||||
the relevant [completion script][completions] into the appropriate directory on
|
||||
your filesystem to enable autocompletion. (This directory will vary depending
|
||||
on operating system and shell specifics.)
|
||||
|
||||
Additionally, `cheat` supports enhanced autocompletion via integration with
|
||||
[fzf][]. To enable `fzf` integration:
|
||||
|
||||
1. Ensure that `fzf` is available on your `$PATH`
|
||||
2. Set an envvar: `export CHEAT_USE_FZF=true`
|
||||
|
||||
[Releases]: https://github.com/cheat/cheat/releases
|
||||
[bash]: https://github.com/cheat/cheat/blob/master/scripts/fzf.bash
|
||||
[cheatsheets]: https://github.com/cheat/cheatsheets
|
||||
[completions]: https://github.com/cheat/cheat/tree/master/scripts
|
||||
[fzf]: https://github.com/junegunn/fzf
|
||||
[go]: https://golang.org
|
||||
|
@ -1,13 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# locate the cheat project root
|
||||
BINDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
|
||||
APPDIR=$(readlink -f "$BINDIR/..")
|
||||
|
||||
# compile the executable
|
||||
cd "$APPDIR/cmd/cheat"
|
||||
go clean && go generate && go build
|
||||
mv "$APPDIR/cmd/cheat/cheat" "$APPDIR/dist/cheat"
|
||||
|
||||
# display a build checksum
|
||||
md5sum "$APPDIR/dist/cheat"
|
@ -1,14 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# locate the cheat project root
|
||||
BINDIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
|
||||
APPDIR=$(readlink -f "$BINDIR/..")
|
||||
|
||||
# build embeds
|
||||
cd "$APPDIR/cmd/cheat"
|
||||
go clean && go generate
|
||||
|
||||
# compile AMD64 for Linux, OSX, and Windows
|
||||
env GOOS=darwin GOARCH=amd64 go build -o "$APPDIR/dist/cheat-darwin-amd64" "$APPDIR/cmd/cheat"
|
||||
env GOOS=linux GOARCH=amd64 go build -o "$APPDIR/dist/cheat-linux-amd64" "$APPDIR/cmd/cheat"
|
||||
env GOOS=windows GOARCH=amd64 go build -o "$APPDIR/dist/cheat-win-amd64.exe" "$APPDIR/cmd/cheat"
|
@ -1,18 +1,20 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"os"
|
||||
"text/tabwriter"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/display"
|
||||
)
|
||||
|
||||
// cmdDirectories lists the configured cheatpaths.
|
||||
func cmdDirectories(opts map[string]interface{}, conf config.Config) {
|
||||
|
||||
// initialize a tabwriter to produce cleanly columnized output
|
||||
w := tabwriter.NewWriter(os.Stdout, 0, 0, 1, ' ', 0)
|
||||
var out bytes.Buffer
|
||||
w := tabwriter.NewWriter(&out, 0, 0, 1, ' ', 0)
|
||||
|
||||
// generate sorted, columnized output
|
||||
for _, path := range conf.Cheatpaths {
|
||||
@ -25,4 +27,5 @@ func cmdDirectories(opts map[string]interface{}, conf config.Config) {
|
||||
|
||||
// write columnized output to stdout
|
||||
w.Flush()
|
||||
display.Write(out.String(), conf)
|
||||
}
|
||||
|
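The hunk above redirects the tabwriter into a `bytes.Buffer` so the columnized listing can be handed to `display.Write` (which knows about the configured pager) rather than being printed straight to stdout. Below is a minimal standalone sketch of the same pattern; `writePaged` and the example cheatpaths are illustrative stand-ins, not the project's actual API.

```go
package main

import (
	"bytes"
	"fmt"
	"os"
	"text/tabwriter"
)

// writePaged stands in for the project's display.Write, which decides whether
// to route the finished string through a pager; here it simply prints.
func writePaged(out string) {
	fmt.Fprint(os.Stdout, out)
}

func main() {
	// columnize into a buffer rather than writing directly to stdout, so the
	// complete string can later be piped through a pager if one is configured
	var out bytes.Buffer
	w := tabwriter.NewWriter(&out, 0, 0, 1, ' ', 0)

	// example cheatpaths (names and locations are made up for illustration)
	cheatpaths := []struct{ Name, Path string }{
		{"community", "~/.config/cheat/cheatsheets/community"},
		{"personal", "~/.config/cheat/cheatsheets/personal"},
	}
	for _, cp := range cheatpaths {
		fmt.Fprintf(w, "%s:\t%s\n", cp.Name, cp.Path)
	}

	w.Flush()
	writePaged(out.String())
}
```

Buffering first keeps the pager decision in one place: the command renders a complete string, and the display layer decides whether to page it.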
@ -99,8 +99,15 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {
|
||||
}
|
||||
}
|
||||
|
||||
// split `conf.Editor` into parts to separate the editor's executable from
|
||||
// any arguments it may have been passed. If this is not done, the nearby
|
||||
// call to `exec.Command` will fail.
|
||||
parts := strings.Fields(conf.Editor)
|
||||
editor := parts[0]
|
||||
args := append(parts[1:], editpath)
|
||||
|
||||
// edit the cheatsheet
|
||||
cmd := exec.Command(conf.Editor, editpath)
|
||||
cmd := exec.Command(editor, args...)
|
||||
cmd.Stdout = os.Stdout
|
||||
cmd.Stdin = os.Stdin
|
||||
cmd.Stderr = os.Stderr
|
||||
|
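The fix above splits `conf.Editor` into an executable and its arguments before calling `exec.Command`; without the split, a setting such as `vim -u NONE` would be treated as a single binary name and the call would fail. A standalone sketch follows, with the editor setting and the cheatsheet path hard-coded as assumed example values.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

func main() {
	// an editor setting that carries arguments, e.g. the value of conf.Editor
	editorSetting := "vim -u NONE"          // assumed example value
	editpath := "/tmp/example-cheatsheet"   // assumed example path

	// split the setting into the executable and its arguments; passing the
	// whole string as the command name would make exec.Command look for a
	// binary literally named "vim -u NONE"
	parts := strings.Fields(editorSetting)
	editor := parts[0]
	args := append(parts[1:], editpath)

	// hand the terminal over to the editor
	cmd := exec.Command(editor, args...)
	cmd.Stdin = os.Stdin
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr

	if err := cmd.Run(); err != nil {
		fmt.Fprintf(os.Stderr, "failed to edit cheatsheet: %v\n", err)
		os.Exit(1)
	}
}
```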
@ -2,9 +2,56 @@ package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"path"
|
||||
"runtime"
|
||||
"strings"
|
||||
|
||||
"github.com/mitchellh/go-homedir"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// cmdInit displays an example config file.
|
||||
func cmdInit() {
|
||||
fmt.Println(configs())
|
||||
|
||||
// get the user's home directory
|
||||
home, err := homedir.Dir()
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to get user home directory: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// read the envvars into a map of strings
|
||||
envvars := map[string]string{}
|
||||
for _, e := range os.Environ() {
|
||||
pair := strings.SplitN(e, "=", 2)
|
||||
envvars[pair[0]] = pair[1]
|
||||
}
|
||||
|
||||
// load the config template
|
||||
configs := configs()
|
||||
|
||||
// identify the os-specific paths at which configs may be located
|
||||
confpaths, err := config.Paths(runtime.GOOS, home, envvars)
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to read config paths: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// determine the appropriate paths for config data and (optional) community
|
||||
// cheatsheets based on the user's platform
|
||||
confpath := confpaths[0]
|
||||
confdir := path.Dir(confpath)
|
||||
|
||||
// create paths for community and personal cheatsheets
|
||||
community := path.Join(confdir, "/cheatsheets/community")
|
||||
personal := path.Join(confdir, "/cheatsheets/personal")
|
||||
|
||||
// template the above paths into the default configs
|
||||
configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)
|
||||
configs = strings.Replace(configs, "PERSONAL_PATH", personal, -1)
|
||||
|
||||
// output the templated configs
|
||||
fmt.Println(configs)
|
||||
}
|
||||
|
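As rewritten above, `cheat --init` no longer just dumps the raw template: it derives the config directory from the first candidate config path, builds community and personal cheatsheet directories beneath it, and substitutes those into the `COMMUNITY_PATH` / `PERSONAL_PATH` placeholders before printing. Here is a condensed sketch of the templating step; the template string and the example path are abbreviations, not the real embedded config.

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// an abbreviated stand-in for the embedded config template
const configTemplate = `cheatpaths:
  - name: community
    path: COMMUNITY_PATH
    readonly: true
  - name: personal
    path: PERSONAL_PATH
    readonly: false
`

func main() {
	// the first candidate config path wins; assumed example value
	confpath := "/home/user/.config/cheat/conf.yml"
	confdir := path.Dir(confpath)

	// derive community and personal cheatsheet directories beneath confdir
	community := path.Join(confdir, "cheatsheets", "community")
	personal := path.Join(confdir, "cheatsheets", "personal")

	// substitute the placeholders and emit the templated config
	configs := configTemplate
	configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)
	configs = strings.Replace(configs, "PERSONAL_PATH", personal, -1)

	fmt.Println(configs)
}
```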
@ -1,13 +1,16 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"fmt"
|
||||
"os"
|
||||
"regexp"
|
||||
"sort"
|
||||
"strings"
|
||||
"text/tabwriter"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/display"
|
||||
"github.com/cheat/cheat/internal/sheet"
|
||||
"github.com/cheat/cheat/internal/sheets"
|
||||
)
|
||||
@ -34,8 +37,8 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
|
||||
// sheets with local sheets), here we simply want to create a slice
|
||||
// containing all sheets.
|
||||
flattened := []sheet.Sheet{}
|
||||
for _, pathSheets := range cheatsheets {
|
||||
for _, s := range pathSheets {
|
||||
for _, pathsheets := range cheatsheets {
|
||||
for _, s := range pathsheets {
|
||||
flattened = append(flattened, s)
|
||||
}
|
||||
}
|
||||
@ -45,16 +48,52 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
|
||||
return flattened[i].Title < flattened[j].Title
|
||||
})
|
||||
|
||||
// exit early if no cheatsheets are available
|
||||
// filter if <cheatsheet> was specified
|
||||
// NB: our docopt specification is misleading here. When used in conjunction
|
||||
// with `-l`, `<cheatsheet>` is really a pattern against which to filter
|
||||
// sheet titles.
|
||||
if opts["<cheatsheet>"] != nil {
|
||||
|
||||
// initialize a slice of filtered sheets
|
||||
filtered := []sheet.Sheet{}
|
||||
|
||||
// initialize our filter pattern
|
||||
pattern := "(?i)" + opts["<cheatsheet>"].(string)
|
||||
|
||||
// compile the regex
|
||||
reg, err := regexp.Compile(pattern)
|
||||
if err != nil {
|
||||
fmt.Fprintln(
|
||||
os.Stderr,
|
||||
fmt.Sprintf("failed to compile regexp: %s, %v", pattern, err),
|
||||
)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// iterate over each cheatsheet, and pass-through those which match the
|
||||
// filter pattern
|
||||
for _, s := range flattened {
|
||||
if reg.MatchString(s.Title) {
|
||||
filtered = append(filtered, s)
|
||||
}
|
||||
}
|
||||
|
||||
flattened = filtered
|
||||
}
|
||||
|
||||
// return exit code 2 if no cheatsheets are available
|
||||
if len(flattened) == 0 {
|
||||
os.Exit(0)
|
||||
os.Exit(2)
|
||||
}
|
||||
|
||||
// initialize a tabwriter to produce cleanly columnized output
|
||||
w := tabwriter.NewWriter(os.Stdout, 0, 0, 1, ' ', 0)
|
||||
var out bytes.Buffer
|
||||
w := tabwriter.NewWriter(&out, 0, 0, 1, ' ', 0)
|
||||
|
||||
// write a header row
|
||||
fmt.Fprintln(w, "title:\tfile:\ttags:")
|
||||
|
||||
// generate sorted, columnized output
|
||||
fmt.Fprintln(w, "title:\tfile:\ttags:")
|
||||
for _, sheet := range flattened {
|
||||
fmt.Fprintln(w, fmt.Sprintf(
|
||||
"%s\t%s\t%s",
|
||||
@ -66,4 +105,5 @@ func cmdList(opts map[string]interface{}, conf config.Config) {
|
||||
|
||||
// write columnized output to stdout
|
||||
w.Flush()
|
||||
display.Write(out.String(), conf)
|
||||
}
|
||||
|
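With this change, `cheat -l <cheatsheet>` treats the positional argument as a case-insensitive regular expression matched against sheet titles, and the process exits with status 2 when nothing survives the filter. The self-contained sketch below uses a pared-down `Sheet` type; the sample titles and the hard-coded pattern are assumptions for illustration.

```go
package main

import (
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Sheet is a pared-down stand-in for the project's sheet.Sheet type.
type Sheet struct {
	Title string
}

func main() {
	flattened := []Sheet{{"tar"}, {"aptitude"}, {"ssh"}, {"apt"}}

	// sort alphabetically by title
	sort.Slice(flattened, func(i, j int) bool {
		return flattened[i].Title < flattened[j].Title
	})

	// `cheat -l apt`: the positional argument is a case-insensitive pattern
	pattern := "(?i)" + "apt" // assumed example argument
	reg, err := regexp.Compile(pattern)
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
		os.Exit(1)
	}

	// keep only the sheets whose titles match the pattern
	filtered := []Sheet{}
	for _, s := range flattened {
		if reg.MatchString(s.Title) {
			filtered = append(filtered, s)
		}
	}

	// exit code 2 signals "no cheatsheets found"
	if len(filtered) == 0 {
		os.Exit(2)
	}
	for _, s := range filtered {
		fmt.Println(s.Title)
	}
}
```

Run against the sample data, the sketch keeps `apt` and `aptitude` and drops `ssh` and `tar`.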
cmd/cheat/cmd_remove.go (new file, 55 lines)
@ -0,0 +1,55 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"strings"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/sheets"
|
||||
)
|
||||
|
||||
// cmdRemove removes (deletes) a cheatsheet.
|
||||
func cmdRemove(opts map[string]interface{}, conf config.Config) {
|
||||
|
||||
cheatsheet := opts["--rm"].(string)
|
||||
|
||||
// load the cheatsheets
|
||||
cheatsheets, err := sheets.Load(conf.Cheatpaths)
|
||||
if err != nil {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// filter cheatsheets by tag if --tag was provided
|
||||
if opts["--tag"] != nil {
|
||||
cheatsheets = sheets.Filter(
|
||||
cheatsheets,
|
||||
strings.Split(opts["--tag"].(string), ","),
|
||||
)
|
||||
}
|
||||
|
||||
// consolidate the cheatsheets found on all paths into a single map of
|
||||
// `title` => `sheet` (ie, allow more local cheatsheets to override less
|
||||
// local cheatsheets)
|
||||
consolidated := sheets.Consolidate(cheatsheets)
|
||||
|
||||
// fail early if the requested cheatsheet does not exist
|
||||
sheet, ok := consolidated[cheatsheet]
|
||||
if !ok {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("No cheatsheet found for '%s'.", cheatsheet))
|
||||
os.Exit(2)
|
||||
}
|
||||
|
||||
// fail early if the sheet is read-only
|
||||
if sheet.ReadOnly {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("cheatsheet '%s' is read-only.", cheatsheet))
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// otherwise, attempt to delete the sheet
|
||||
if err := os.Remove(sheet.Path); err != nil {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to delete sheet: %s, %v", sheet.Title, err))
|
||||
os.Exit(1)
|
||||
}
|
||||
}
|
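The new `--rm` command first consolidates the per-path sheets into a single `title => sheet` map (so the most local copy wins), then refuses to delete read-only sheets and exits with status 2 for unknown titles. A standalone sketch of those guards follows, with a throwaway temp file standing in for a real cheatsheet.

```go
package main

import (
	"fmt"
	"os"
)

// Sheet is a pared-down stand-in for the project's sheet.Sheet type.
type Sheet struct {
	Title    string
	Path     string
	ReadOnly bool
}

// removeSheet mirrors the guard logic above: unknown titles exit 2,
// read-only sheets exit 1, and otherwise the underlying file is deleted.
func removeSheet(consolidated map[string]Sheet, title string) {
	sheet, ok := consolidated[title]
	if !ok {
		fmt.Fprintf(os.Stderr, "No cheatsheet found for '%s'.\n", title)
		os.Exit(2)
	}
	if sheet.ReadOnly {
		fmt.Fprintf(os.Stderr, "cheatsheet '%s' is read-only.\n", title)
		os.Exit(1)
	}
	if err := os.Remove(sheet.Path); err != nil {
		fmt.Fprintf(os.Stderr, "failed to delete sheet: %s, %v\n", title, err)
		os.Exit(1)
	}
}

func main() {
	// create a throwaway file so the deletion below can succeed
	f, err := os.CreateTemp("", "cheat-example-*")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	f.Close()

	consolidated := map[string]Sheet{
		"tar": {Title: "tar", Path: f.Name(), ReadOnly: false},
	}
	removeSheet(consolidated, "tar")
}
```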
@ -7,6 +7,7 @@ import (
|
||||
"strings"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/display"
|
||||
"github.com/cheat/cheat/internal/sheets"
|
||||
)
|
||||
|
||||
@ -30,46 +31,65 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
|
||||
)
|
||||
}
|
||||
|
||||
// consolidate the cheatsheets found on all paths into a single map of
|
||||
// `title` => `sheet` (ie, allow more local cheatsheets to override less
|
||||
// local cheatsheets)
|
||||
consolidated := sheets.Consolidate(cheatsheets)
|
||||
// iterate over each cheatpath
|
||||
out := ""
|
||||
for _, pathcheats := range cheatsheets {
|
||||
|
||||
// sort the cheatsheets alphabetically, and search for matches
|
||||
for _, sheet := range sheets.Sort(consolidated) {
|
||||
// sort the cheatsheets alphabetically, and search for matches
|
||||
for _, sheet := range sheets.Sort(pathcheats) {
|
||||
|
||||
// colorize output?
|
||||
colorize := false
|
||||
if conf.Colorize == true || opts["--colorize"] == true {
|
||||
colorize = true
|
||||
}
|
||||
|
||||
// assume that we want to perform a case-insensitive search for <phrase>
|
||||
pattern := "(?i)" + phrase
|
||||
|
||||
// unless --regex is provided, in which case we pass the regex unaltered
|
||||
if opts["--regex"] == true {
|
||||
pattern = phrase
|
||||
}
|
||||
|
||||
// compile the regex
|
||||
reg, err := regexp.Compile(pattern)
|
||||
if err != nil {
|
||||
fmt.Errorf("failed to compile regexp: %s, %v", pattern, err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, colorize)
|
||||
|
||||
// display the results
|
||||
if len(matches) > 0 {
|
||||
fmt.Printf("%s:\n", sheet.Title)
|
||||
for _, m := range matches {
|
||||
fmt.Printf(" %d: %s\n", m.Line, m.Text)
|
||||
// if <cheatsheet> was provided, constrain the search only to
|
||||
// matching cheatsheets
|
||||
if opts["<cheatsheet>"] != nil && sheet.Title != opts["<cheatsheet>"] {
|
||||
continue
|
||||
}
|
||||
fmt.Print("\n")
|
||||
|
||||
// assume that we want to perform a case-insensitive search for <phrase>
|
||||
pattern := "(?i)" + phrase
|
||||
|
||||
// unless --regex is provided, in which case we pass the regex unaltered
|
||||
if opts["--regex"] == true {
|
||||
pattern = phrase
|
||||
}
|
||||
|
||||
// compile the regex
|
||||
reg, err := regexp.Compile(pattern)
|
||||
if err != nil {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to compile regexp: %s, %v", pattern, err))
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// `Search` will return text entries that match the search terms. We're
|
||||
// using it here to overwrite the prior cheatsheet Text, filtering it to
|
||||
// only what is relevant
|
||||
sheet.Text = sheet.Search(reg)
|
||||
|
||||
// if the sheet did not match the search, ignore it and move on
|
||||
if sheet.Text == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
// if colorization was requested, apply it here
|
||||
if conf.Color(opts) {
|
||||
sheet.Colorize(conf)
|
||||
}
|
||||
|
||||
// display the cheatsheet title and path
|
||||
out += fmt.Sprintf("%s %s\n",
|
||||
display.Underline(sheet.Title),
|
||||
display.Faint(fmt.Sprintf("(%s)", sheet.CheatPath), conf),
|
||||
)
|
||||
|
||||
// indent each line of content
|
||||
out += display.Indent(sheet.Text) + "\n"
|
||||
}
|
||||
}
|
||||
|
||||
// trim superfluous newlines
|
||||
out = strings.TrimSpace(out)
|
||||
|
||||
// display the output
|
||||
// NB: resist the temptation to call `display.Display` multiple times in
|
||||
// the loop above. That will not play nicely with the paginator.
|
||||
display.Write(out, conf)
|
||||
}
|
||||
|
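Note the comment above: the rewritten search accumulates every match into one string and writes it exactly once, because paginating inside the loop would spawn a fresh pager per sheet. Below is a stdlib-only sketch of that accumulate-then-write shape; `writePaged` and the sample sheets are illustrative stand-ins for `display.Write` and real cheatsheet data.

```go
package main

import (
	"fmt"
	"os"
	"regexp"
	"strings"
)

// writePaged stands in for the project's display.Write; a real implementation
// would pipe the finished string through the configured pager exactly once.
func writePaged(out string) {
	fmt.Println(out)
}

func main() {
	// sample sheets: title -> text (made up for illustration)
	sheets := map[string]string{
		"tar": "# To extract an archive:\ntar -xvf foo.tar",
		"ssh": "# To connect with a key:\nssh -i key.pem user@host",
	}

	// case-insensitive search phrase (passed through unaltered with --regex)
	reg, err := regexp.Compile("(?i)" + "extract")
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to compile regexp: %v\n", err)
		os.Exit(1)
	}

	// accumulate every matching sheet into a single string...
	out := ""
	for title, text := range sheets {
		matched := []string{}
		for _, line := range strings.Split(text, "\n") {
			if reg.MatchString(line) {
				matched = append(matched, "  "+line)
			}
		}
		if len(matched) == 0 {
			continue
		}
		out += fmt.Sprintf("%s:\n%s\n", title, strings.Join(matched, "\n"))
	}

	// ...and page it in one shot, rather than once per sheet
	writePaged(strings.TrimSpace(out))
}
```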
cmd/cheat/cmd_tags.go (new file, 30 lines)
@ -0,0 +1,30 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/display"
|
||||
"github.com/cheat/cheat/internal/sheets"
|
||||
)
|
||||
|
||||
// cmdTags lists all tags in use.
|
||||
func cmdTags(opts map[string]interface{}, conf config.Config) {
|
||||
|
||||
// load the cheatsheets
|
||||
cheatsheets, err := sheets.Load(conf.Cheatpaths)
|
||||
if err != nil {
|
||||
fmt.Fprintln(os.Stderr, fmt.Sprintf("failed to list cheatsheets: %v", err))
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// assemble the output
|
||||
out := ""
|
||||
for _, tag := range sheets.Tags(cheatsheets) {
|
||||
out += fmt.Sprintln(tag)
|
||||
}
|
||||
|
||||
// display the output
|
||||
display.Write(out, conf)
|
||||
}
|
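`cmdTags` delegates the real work to `sheets.Tags`; presumably that helper collects the union of tags across every loaded cheatsheet and returns them sorted. A hedged sketch of that behavior is below — the `uniqueTags` helper and the sample data are assumptions, not the project's implementation.

```go
package main

import (
	"fmt"
	"sort"
)

// uniqueTags gathers the union of tags across all sheets and returns them
// sorted. This is an assumption about what the project's sheets.Tags does.
func uniqueTags(sheetTags map[string][]string) []string {
	seen := map[string]bool{}
	for _, tags := range sheetTags {
		for _, t := range tags {
			seen[t] = true
		}
	}
	out := make([]string, 0, len(seen))
	for t := range seen {
		out = append(out, t)
	}
	sort.Strings(out)
	return out
}

func main() {
	// sample sheet -> tags data (made up for illustration)
	sheetTags := map[string][]string{
		"tar": {"community"},
		"ssh": {"community", "networking"},
	}
	for _, tag := range uniqueTags(sheetTags) {
		fmt.Println(tag)
	}
}
```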
@ -5,9 +5,8 @@ import (
|
||||
"os"
|
||||
"strings"
|
||||
|
||||
"github.com/alecthomas/chroma/quick"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/display"
|
||||
"github.com/cheat/cheat/internal/sheets"
|
||||
)
|
||||
|
||||
@ -31,42 +30,53 @@ func cmdView(opts map[string]interface{}, conf config.Config) {
|
||||
)
|
||||
}
|
||||
|
||||
// consolidate the cheatsheets found on all paths into a single map of
|
||||
// `title` => `sheet` (ie, allow more local cheatsheets to override less
|
||||
// local cheatsheets)
|
||||
// if --all was passed, display cheatsheets from all cheatpaths
|
||||
if opts["--all"].(bool) {
|
||||
// iterate over the cheatpaths
|
||||
out := ""
|
||||
for _, cheatpath := range cheatsheets {
|
||||
|
||||
// if the cheatpath contains the specified cheatsheet, display it
|
||||
if sheet, ok := cheatpath[cheatsheet]; ok {
|
||||
|
||||
// identify the matching cheatsheet
|
||||
out += fmt.Sprintf("%s %s\n",
|
||||
display.Underline(sheet.Title),
|
||||
display.Faint(fmt.Sprintf("(%s)", sheet.CheatPath), conf),
|
||||
)
|
||||
|
||||
// apply colorization if requested
|
||||
if conf.Color(opts) {
|
||||
sheet.Colorize(conf)
|
||||
}
|
||||
|
||||
// display the cheatsheet
|
||||
out += display.Indent(sheet.Text) + "\n"
|
||||
}
|
||||
}
|
||||
|
||||
// display and exit
|
||||
display.Write(strings.TrimSuffix(out, "\n"), conf)
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// otherwise, consolidate the cheatsheets found on all paths into a single
|
||||
// map of `title` => `sheet` (ie, allow more local cheatsheets to override
|
||||
// less local cheatsheets)
|
||||
consolidated := sheets.Consolidate(cheatsheets)
|
||||
|
||||
// fail early if the requested cheatsheet does not exist
|
||||
sheet, ok := consolidated[cheatsheet]
|
||||
if !ok {
|
||||
fmt.Printf("No cheatsheet found for '%s'.\n", cheatsheet)
|
||||
os.Exit(0)
|
||||
os.Exit(2)
|
||||
}
|
||||
|
||||
// if colorization is not desired, output un-colorized text and exit
|
||||
if conf.Colorize == false && opts["--colorize"] == false {
|
||||
fmt.Print(sheet.Text)
|
||||
os.Exit(0)
|
||||
// apply colorization if requested
|
||||
if conf.Color(opts) {
|
||||
sheet.Colorize(conf)
|
||||
}
|
||||
|
||||
// otherwise, colorize the output
|
||||
// if the syntax was not specified, default to bash
|
||||
lex := sheet.Syntax
|
||||
if lex == "" {
|
||||
lex = "bash"
|
||||
}
|
||||
|
||||
// apply syntax highlighting
|
||||
err = quick.Highlight(
|
||||
os.Stdout,
|
||||
sheet.Text,
|
||||
lex,
|
||||
conf.Formatter,
|
||||
conf.Style,
|
||||
)
|
||||
|
||||
// if colorization somehow failed, output non-colorized text
|
||||
if err != nil {
|
||||
fmt.Print(sheet.Text)
|
||||
}
|
||||
// display the cheatsheet
|
||||
display.Write(sheet.Text, conf)
|
||||
}
|
||||
|
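Summarizing the branch added above: with `--all`, `cmdView` walks every cheatpath and prints each copy of the requested sheet annotated with the cheatpath it came from; without it, the paths are consolidated so the most local copy wins. A plain-strings sketch of both branches follows (path names and sheet text are made up for illustration).

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// per-cheatpath maps of title -> sheet text, ordered from most global
	// (community) to most local (personal)
	cheatpaths := []struct {
		Name   string
		Sheets map[string]string
	}{
		{"community", map[string]string{"tar": "tar -xvf foo.tar"}},
		{"personal", map[string]string{"tar": "tar -xzvf foo.tar.gz"}},
	}

	title := "tar"
	all := true // corresponds to the --all flag

	if all {
		// show every copy of the sheet, annotated with its cheatpath
		out := ""
		for _, cp := range cheatpaths {
			if text, ok := cp.Sheets[title]; ok {
				out += fmt.Sprintf("%s (%s)\n  %s\n", title, cp.Name, text)
			}
		}
		fmt.Print(out)
		return
	}

	// otherwise consolidate: later (more local) paths override earlier ones
	consolidated := map[string]string{}
	for _, cp := range cheatpaths {
		for t, text := range cp.Sheets {
			consolidated[t] = text
		}
	}
	if text, ok := consolidated[title]; ok {
		fmt.Println(strings.TrimSpace(text))
	}
}
```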
@ -2,16 +2,19 @@ Usage:
|
||||
cheat [options] [<cheatsheet>]
|
||||
|
||||
Options:
|
||||
--init Write a default config file to stdout
|
||||
-c --colorize Colorize output
|
||||
-d --directories List cheatsheet directories
|
||||
-e --edit=<sheet> Edit cheatsheet
|
||||
-l --list List cheatsheets
|
||||
-p --path=<name> Return only sheets found on path <name>
|
||||
-r --regex Treat search <phrase> as a regex
|
||||
-s --search=<phrase> Search cheatsheets for <phrase>
|
||||
-t --tag=<tag> Return only sheets matching <tag>
|
||||
-v --version Print the version number
|
||||
--init Write a default config file to stdout
|
||||
-a --all Search among all cheatpaths
|
||||
-c --colorize Colorize output
|
||||
-d --directories List cheatsheet directories
|
||||
-e --edit=<cheatsheet> Edit <cheatsheet>
|
||||
-l --list List cheatsheets
|
||||
-p --path=<name> Return only sheets found on cheatpath <name>
|
||||
-r --regex Treat search <phrase> as a regex
|
||||
-s --search=<phrase> Search cheatsheets for <phrase>
|
||||
-t --tag=<tag> Return only sheets matching <tag>
|
||||
-T --tags List all tags in use
|
||||
-v --version Print the version number
|
||||
--rm=<cheatsheet> Remove (delete) <cheatsheet>
|
||||
|
||||
Examples:
|
||||
|
||||
@ -33,6 +36,12 @@ Examples:
|
||||
To list all available cheatsheets:
|
||||
cheat -l
|
||||
|
||||
To list all cheatsheets whose titles match "apt":
|
||||
cheat -l apt
|
||||
|
||||
To list all tags in use:
|
||||
cheat -T
|
||||
|
||||
To list available cheatsheets that are tagged as "personal":
|
||||
cheat -l -t personal
|
||||
|
||||
@ -41,3 +50,6 @@ Examples:
|
||||
|
||||
To search (by regex) for cheatsheets that contain an IP address:
|
||||
cheat -c -r -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
|
||||
|
||||
To remove (delete) the foo/bar cheatsheet:
|
||||
cheat --rm foo/bar
|
||||
|
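This options text doubles as the docopt specification: docopt parses it together with `os.Args` into a map of flag names to values, and `main` dispatches on the first matching entry. The sketch below fills that map in by hand (so it stays stdlib-only) and mirrors the dispatch order used in `main.go` further down this diff; the command labels are descriptive strings, not the project's function names.

```go
package main

import "fmt"

func main() {
	// docopt would normally build this map from the usage text and os.Args;
	// it is filled in by hand here for illustration
	opts := map[string]interface{}{
		"--directories": false,
		"--edit":        nil,
		"--init":        false,
		"--list":        false,
		"--tags":        true, // e.g. `cheat -T`
		"--search":      nil,
		"--rm":          nil,
		"<cheatsheet>":  nil,
	}

	// dispatch on the first matching option, mirroring the switch in main.go
	var cmd string
	switch {
	case opts["--directories"].(bool):
		cmd = "directories"
	case opts["--edit"] != nil:
		cmd = "edit"
	case opts["--init"].(bool):
		cmd = "init"
	case opts["--list"].(bool):
		cmd = "list"
	case opts["--tags"].(bool):
		cmd = "tags"
	case opts["--search"] != nil:
		cmd = "search"
	case opts["--rm"] != nil:
		cmd = "remove"
	case opts["<cheatsheet>"] != nil:
		cmd = "view"
	default:
		cmd = "usage"
	}

	fmt.Println("dispatching to:", cmd)
}
```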
@ -6,14 +6,17 @@ import (
|
||||
"fmt"
|
||||
"os"
|
||||
"runtime"
|
||||
"strings"
|
||||
|
||||
"github.com/docopt/docopt-go"
|
||||
"github.com/mitchellh/go-homedir"
|
||||
|
||||
"github.com/cheat/cheat/internal/cheatpath"
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
"github.com/cheat/cheat/internal/installer"
|
||||
)
|
||||
|
||||
const version = "3.0.1"
|
||||
const version = "4.2.2"
|
||||
|
||||
func main() {
|
||||
|
||||
@ -31,15 +34,62 @@ func main() {
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// load the config file
|
||||
confpath, err := config.Path(runtime.GOOS)
|
||||
// get the user's home directory
|
||||
home, err := homedir.Dir()
|
||||
if err != nil {
|
||||
fmt.Fprintln(os.Stderr, "could not locate config file")
|
||||
fmt.Fprintf(os.Stderr, "failed to get user home directory: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// read the envvars into a map of strings
|
||||
envvars := map[string]string{}
|
||||
for _, e := range os.Environ() {
|
||||
pair := strings.SplitN(e, "=", 2)
|
||||
envvars[pair[0]] = pair[1]
|
||||
}
|
||||
|
||||
// identify the os-specific paths at which configs may be located
|
||||
confpaths, err := config.Paths(runtime.GOOS, home, envvars)
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to load config: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// search for the config file in the above paths
|
||||
confpath, err := config.Path(confpaths)
|
||||
if err != nil {
|
||||
// prompt the user to create a config file
|
||||
yes, err := installer.Prompt(
|
||||
"A config file was not found. Would you like to create one now? [Y/n]",
|
||||
true,
|
||||
)
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to create config: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// exit early on a negative answer
|
||||
if !yes {
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// choose a confpath
|
||||
confpath = confpaths[0]
|
||||
|
||||
// run the installer
|
||||
if err := installer.Run(configs(), confpath); err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to run installer: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
|
||||
// notify the user and exit
|
||||
fmt.Printf("Created config file: %s\n", confpath)
|
||||
fmt.Println("Please read this file for advanced configuration information.")
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// initialize the configs
|
||||
conf, err := config.New(opts, confpath)
|
||||
conf, err := config.New(opts, confpath, true)
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to load config: %v\n", err)
|
||||
os.Exit(1)
|
||||
@ -76,12 +126,21 @@ func main() {
|
||||
case opts["--list"].(bool):
|
||||
cmd = cmdList
|
||||
|
||||
case opts["--tags"].(bool):
|
||||
cmd = cmdTags
|
||||
|
||||
case opts["--search"] != nil:
|
||||
cmd = cmdSearch
|
||||
|
||||
case opts["--rm"] != nil:
|
||||
cmd = cmdRemove
|
||||
|
||||
case opts["<cheatsheet>"] != nil:
|
||||
cmd = cmdView
|
||||
|
||||
case opts["--tag"] != nil && opts["--tag"].(string) != "":
|
||||
cmd = cmdList
|
||||
|
||||
default:
|
||||
fmt.Println(usage())
|
||||
os.Exit(0)
|
||||
|
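The rewritten startup sequence reads the environment into a map, asks for the platform-specific candidate config locations, and only falls back to the interactive installer when none of them contains a config file. Below is a standalone sketch of the path-probing half on Linux-like systems; the candidate order follows the man page later in this diff, while `config.Paths`, `installer.Prompt`, and the go-homedir dependency are replaced by plain stdlib calls.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// candidatePaths mimics the spirit of the config path search on Linux:
// CHEAT_CONFIG_PATH first, then XDG_CONFIG_HOME, then home-relative defaults.
func candidatePaths(home string, env map[string]string) []string {
	if p, ok := env["CHEAT_CONFIG_PATH"]; ok && p != "" {
		return []string{p}
	}
	paths := []string{}
	if xdg, ok := env["XDG_CONFIG_HOME"]; ok && xdg != "" {
		paths = append(paths, filepath.Join(xdg, "cheat", "conf.yml"))
	}
	paths = append(paths,
		filepath.Join(home, ".config", "cheat", "conf.yml"),
		filepath.Join(home, ".cheat", "conf.yml"),
	)
	return paths
}

func main() {
	// stdlib stand-in for the go-homedir dependency used upstream
	home, err := os.UserHomeDir()
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to get user home directory: %v\n", err)
		os.Exit(1)
	}

	// read the environment into a map of strings
	env := map[string]string{}
	for _, e := range os.Environ() {
		pair := strings.SplitN(e, "=", 2)
		env[pair[0]] = pair[1]
	}

	// use the first candidate path that exists; the real program would
	// otherwise offer to run the installer against the first candidate
	for _, p := range candidatePaths(home, env) {
		if _, err := os.Stat(p); err == nil {
			fmt.Println("using config:", p)
			return
		}
	}
	fmt.Println("no config found; the installer prompt would appear here")
}
```

Setting `CHEAT_CONFIG_PATH` short-circuits the search, which matches the ENVIRONMENT section of the man page included in this change.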
@ -23,6 +23,9 @@ style: monokai
|
||||
# One of: "terminal", "terminal256", "terminal16m"
|
||||
formatter: terminal16m
|
||||
|
||||
# Through which pager should output be piped? (Unset this key for no pager.)
|
||||
pager: less -FRX
|
||||
|
||||
# The paths at which cheatsheets are available. Tags associated with a cheatpath
|
||||
# are automatically attached to all cheatsheets residing on that path.
|
||||
#
|
||||
@ -32,34 +35,46 @@ formatter: terminal16m
|
||||
# "upstream" cheatsheets.
|
||||
#
|
||||
# But what if you want to view the "upstream" cheatsheets instead of your own?
|
||||
# Cheatsheets may be filtered via 'cheat -f <tag>' in combination with other
|
||||
# Cheatsheets may be filtered via 'cheat -t <tag>' in combination with other
|
||||
# commands. So, if you want to view the 'tar' cheatsheet that is tagged as
|
||||
# 'community' rather than your own, you can use: cheat tar -f community
|
||||
# 'community' rather than your own, you can use: cheat tar -t community
|
||||
cheatpaths:
|
||||
|
||||
# Paths that come earlier are considered to be the most "global", and will
|
||||
# thus be overridden by more local cheatsheets. That being the case, you
|
||||
# should probably list community cheatsheets first.
|
||||
#
|
||||
# Note that the paths and tags listed below are just examples. You may freely
|
||||
# Note that the paths and tags listed below are placeholders. You may freely
|
||||
# change them to suit your needs.
|
||||
#
|
||||
# Community cheatsheets must be installed separately, though you may have
|
||||
# downloaded them automatically when installing 'cheat'. If not, you may
|
||||
# download them here:
|
||||
#
|
||||
# https://github.com/cheat/cheatsheets
|
||||
#
|
||||
# Once downloaded, ensure that 'path' below points to the location at which
|
||||
# you downloaded the community cheatsheets.
|
||||
- name: community
|
||||
path: ~/.dotfiles/cheat/community
|
||||
path: COMMUNITY_PATH
|
||||
tags: [ community ]
|
||||
readonly: true
|
||||
|
||||
# Maybe your company or department maintains a repository of cheatsheets as
|
||||
# well. It's probably sensible to list those second.
|
||||
- name: work
|
||||
path: ~/.dotfiles/cheat/work
|
||||
tags: [ work ]
|
||||
readonly: false
|
||||
|
||||
# If you have personalized cheatsheets, list them last. They will take
|
||||
# precedence over the more global cheatsheets.
|
||||
- name: personal
|
||||
path: ~/.dotfiles/cheat/personal
|
||||
path: PERSONAL_PATH
|
||||
tags: [ personal ]
|
||||
readonly: false
|
||||
|
||||
# While it requires no configuration here, it's also worth noting that
|
||||
# 'cheat' will automatically append directories named '.cheat' within the
|
||||
# current working directory to the 'cheatpath'. This can be very useful if
|
||||
# you'd like to closely associate cheatsheets with, for example, a directory
|
||||
# containing source code.
|
||||
#
|
||||
# Such "directory-scoped" cheatsheets will be treated as the most "local"
|
||||
# cheatsheets, and will override less "local" cheatsheets. Likewise,
|
||||
# directory-scoped cheatsheets will always be editable ('readonly: false').
|
||||
`)
|
||||
}
|
||||
|
@ -11,16 +11,19 @@ func usage() string {
|
||||
cheat [options] [<cheatsheet>]
|
||||
|
||||
Options:
|
||||
--init Write a default config file to stdout
|
||||
-c --colorize Colorize output
|
||||
-d --directories List cheatsheet directories
|
||||
-e --edit=<sheet> Edit cheatsheet
|
||||
-l --list List cheatsheets
|
||||
-p --path=<name> Return only sheets found on path <name>
|
||||
-r --regex Treat search <phrase> as a regex
|
||||
-s --search=<phrase> Search cheatsheets for <phrase>
|
||||
-t --tag=<tag> Return only sheets matching <tag>
|
||||
-v --version Print the version number
|
||||
--init Write a default config file to stdout
|
||||
-a --all Search among all cheatpaths
|
||||
-c --colorize Colorize output
|
||||
-d --directories List cheatsheet directories
|
||||
-e --edit=<cheatsheet> Edit <cheatsheet>
|
||||
-l --list List cheatsheets
|
||||
-p --path=<name> Return only sheets found on cheatpath <name>
|
||||
-r --regex Treat search <phrase> as a regex
|
||||
-s --search=<phrase> Search cheatsheets for <phrase>
|
||||
-t --tag=<tag> Return only sheets matching <tag>
|
||||
-T --tags List all tags in use
|
||||
-v --version Print the version number
|
||||
--rm=<cheatsheet> Remove (delete) <cheatsheet>
|
||||
|
||||
Examples:
|
||||
|
||||
@ -42,6 +45,12 @@ Examples:
|
||||
To list all available cheatsheets:
|
||||
cheat -l
|
||||
|
||||
To list all cheatsheets whose titles match "apt":
|
||||
cheat -l apt
|
||||
|
||||
To list all tags in use:
|
||||
cheat -T
|
||||
|
||||
To list available cheatsheets that are tagged as "personal":
|
||||
cheat -l -t personal
|
||||
|
||||
@ -50,5 +59,8 @@ Examples:
|
||||
|
||||
To search (by regex) for cheatsheets that contain an IP address:
|
||||
cheat -c -r -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'
|
||||
|
||||
To remove (delete) the foo/bar cheatsheet:
|
||||
cheat --rm foo/bar
|
||||
`)
|
||||
}
|
||||
|
@ -14,6 +14,9 @@ style: monokai
|
||||
# One of: "terminal", "terminal256", "terminal16m"
|
||||
formatter: terminal16m
|
||||
|
||||
# Through which pager should output be piped? (Unset this key for no pager.)
|
||||
pager: less -FRX
|
||||
|
||||
# The paths at which cheatsheets are available. Tags associated with a cheatpath
|
||||
# are automatically attached to all cheatsheets residing on that path.
|
||||
#
|
||||
@ -23,32 +26,44 @@ formatter: terminal16m
|
||||
# "upstream" cheatsheets.
|
||||
#
|
||||
# But what if you want to view the "upstream" cheatsheets instead of your own?
|
||||
# Cheatsheets may be filtered via 'cheat -f <tag>' in combination with other
|
||||
# Cheatsheets may be filtered via 'cheat -t <tag>' in combination with other
|
||||
# commands. So, if you want to view the 'tar' cheatsheet that is tagged as
|
||||
# 'community' rather than your own, you can use: cheat tar -f community
|
||||
# 'community' rather than your own, you can use: cheat tar -t community
|
||||
cheatpaths:
|
||||
|
||||
# Paths that come earlier are considered to be the most "global", and will
|
||||
# thus be overridden by more local cheatsheets. That being the case, you
|
||||
# should probably list community cheatsheets first.
|
||||
#
|
||||
# Note that the paths and tags listed below are just examples. You may freely
|
||||
# Note that the paths and tags listed below are placeholders. You may freely
|
||||
# change them to suit your needs.
|
||||
#
|
||||
# Community cheatsheets must be installed separately, though you may have
|
||||
# downloaded them automatically when installing 'cheat'. If not, you may
|
||||
# download them here:
|
||||
#
|
||||
# https://github.com/cheat/cheatsheets
|
||||
#
|
||||
# Once downloaded, ensure that 'path' below points to the location at which
|
||||
# you downloaded the community cheatsheets.
|
||||
- name: community
|
||||
path: ~/.dotfiles/cheat/community
|
||||
path: COMMUNITY_PATH
|
||||
tags: [ community ]
|
||||
readonly: true
|
||||
|
||||
# Maybe your company or department maintains a repository of cheatsheets as
|
||||
# well. It's probably sensible to list those second.
|
||||
- name: work
|
||||
path: ~/.dotfiles/cheat/work
|
||||
tags: [ work ]
|
||||
readonly: false
|
||||
|
||||
# If you have personalized cheatsheets, list them last. They will take
|
||||
# precedence over the more global cheatsheets.
|
||||
- name: personal
|
||||
path: ~/.dotfiles/cheat/personal
|
||||
path: PERSONAL_PATH
|
||||
tags: [ personal ]
|
||||
readonly: false
|
||||
|
||||
# While it requires no configuration here, it's also worth noting that
|
||||
# 'cheat' will automatically append directories named '.cheat' within the
|
||||
# current working directory to the 'cheatpath'. This can be very useful if
|
||||
# you'd like to closely associate cheatsheets with, for example, a directory
|
||||
# containing source code.
|
||||
#
|
||||
# Such "directory-scoped" cheatsheets will be treated as the most "local"
|
||||
# cheatsheets, and will override less "local" cheatsheets. Likewise,
|
||||
# directory-scoped cheatsheets will always be editable ('readonly: false').
|
||||
|
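The template above implies a small configuration schema: colorization and styling options, an optional pager, and a list of cheatpaths that each carry a name, a path, tags, and a read-only flag. The sketch below shows how such a file could be unmarshalled in Go with `gopkg.in/yaml.v2`; the struct names and field layout are assumptions for illustration, not the project's actual `config.Config` type.

```go
package main

import (
	"fmt"

	"gopkg.in/yaml.v2"
)

// Cheatpath and Config are illustrative only; the real config types may
// differ in naming and shape.
type Cheatpath struct {
	Name     string   `yaml:"name"`
	Path     string   `yaml:"path"`
	Tags     []string `yaml:"tags"`
	ReadOnly bool     `yaml:"readonly"`
}

type Config struct {
	Colorize   bool        `yaml:"colorize"`
	Style      string      `yaml:"style"`
	Formatter  string      `yaml:"formatter"`
	Pager      string      `yaml:"pager"`
	Cheatpaths []Cheatpath `yaml:"cheatpaths"`
}

func main() {
	// a trimmed-down conf.yml in the shape of the template above
	raw := []byte(`
colorize: false
style: monokai
formatter: terminal16m
pager: less -FRX
cheatpaths:
  - name: community
    path: /home/user/.config/cheat/cheatsheets/community
    tags: [ community ]
    readonly: true
  - name: personal
    path: /home/user/.config/cheat/cheatsheets/personal
    tags: [ personal ]
    readonly: false
`)

	var conf Config
	if err := yaml.Unmarshal(raw, &conf); err != nil {
		panic(err)
	}
	for _, cp := range conf.Cheatpaths {
		fmt.Printf("%s -> %s (readonly: %v)\n", cp.Name, cp.Path, cp.ReadOnly)
	}
}
```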
doc/cheat.1 (new file, 221 lines)
@ -0,0 +1,221 @@
|
||||
.\" Automatically generated by Pandoc 2.2.1
|
||||
.\"
|
||||
.TH "CHEAT" "1" "" "" "General Commands Manual"
|
||||
.hy
|
||||
.SH NAME
|
||||
.PP
|
||||
\f[B]cheat\f[] \[em] create and view command\-line cheatsheets
|
||||
.SH SYNOPSIS
|
||||
.PP
|
||||
\f[B]cheat\f[] [options] [\f[I]CHEATSHEET\f[]]
|
||||
.SH DESCRIPTION
|
||||
.PP
|
||||
\f[B]cheat\f[] allows you to create and view interactive cheatsheets on
|
||||
the command\-line.
|
||||
It was designed to help remind *nix system administrators of options for
|
||||
commands that they use frequently, but not frequently enough to
|
||||
remember.
|
||||
.SH OPTIONS
|
||||
.TP
|
||||
.B \[en]init
|
||||
Print a config file to stdout.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-c, \[en]colorize
|
||||
Colorize output.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-d, \[en]directories
|
||||
List cheatsheet directories.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-e, \[en]edit=\f[I]CHEATSHEET\f[]
|
||||
Open \f[I]CHEATSHEET\f[] for editing.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-l, \[en]list
|
||||
List available cheatsheets.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-p, \[en]path=\f[I]PATH\f[]
|
||||
Filter only to sheets found on path \f[I]PATH\f[].
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-r, \[en]regex
|
||||
Treat search \f[I]PHRASE\f[] as a regular expression.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-s, \[en]search=\f[I]PHRASE\f[]
|
||||
Search cheatsheets for \f[I]PHRASE\f[].
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-t, \[en]tag=\f[I]TAG\f[]
|
||||
Filter only to sheets tagged with \f[I]TAG\f[].
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-T, \[en]tags
|
||||
List all tags in use.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \-v, \[en]version
|
||||
Print the version number.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \[en]rm=\f[I]CHEATSHEET\f[]
|
||||
Remove (delete) \f[I]CHEATSHEET\f[].
|
||||
.RS
|
||||
.RE
|
||||
.SH EXAMPLES
|
||||
.TP
|
||||
.B To view the foo cheatsheet:
|
||||
cheat \f[I]foo\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To edit (or create) the foo cheatsheet:
|
||||
cheat \-e \f[I]foo\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To edit (or create) the foo/bar cheatsheet on the `work' cheatpath:
|
||||
cheat \-p \f[I]work\f[] \-e \f[I]foo/bar\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To view all cheatsheet directories:
|
||||
cheat \-d
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To list all available cheatsheets:
|
||||
cheat \-l
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To list all cheatsheets whose titles match `apt':
|
||||
cheat \-l \f[I]apt\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To list all tags in use:
|
||||
cheat \-T
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To list available cheatsheets that are tagged as `personal':
|
||||
cheat \-l \-t \f[I]personal\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To search for `ssh' among all cheatsheets, and colorize matches:
|
||||
cheat \-c \-s \f[I]ssh\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To search (by regex) for cheatsheets that contain an IP address:
|
||||
cheat \-c \-r \-s \f[I]`(?:[0\-9]{1,3}.){3}[0\-9]{1,3}'\f[]
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B To remove (delete) the foo/bar cheatsheet:
|
||||
cheat \[en]rm \f[I]foo/bar\f[]
|
||||
.RS
|
||||
.RE
|
||||
.SH FILES
|
||||
.SS Configuration
|
||||
.PP
|
||||
\f[B]cheat\f[] is configured via a YAML file that is conventionally
|
||||
named \f[I]conf.yaml\f[].
|
||||
\f[B]cheat\f[] will search for \f[I]conf.yaml\f[] in varying locations,
|
||||
depending upon your platform:
|
||||
.SS Linux, OSX, and other Unixes
|
||||
.IP "1." 3
|
||||
\f[B]CHEAT_CONFIG_PATH\f[]
|
||||
.IP "2." 3
|
||||
\f[B]XDG_CONFIG_HOME\f[]/cheat/conf.yaml
|
||||
.IP "3." 3
|
||||
\f[B]$HOME\f[]/.config/cheat/conf.yml
|
||||
.IP "4." 3
|
||||
\f[B]$HOME\f[]/.cheat/conf.yml
|
||||
.SS Windows
|
||||
.IP "1." 3
|
||||
\f[B]CHEAT_CONFIG_PATH\f[]
|
||||
.IP "2." 3
|
||||
\f[B]APPDATA\f[]/cheat/conf.yml
|
||||
.IP "3." 3
|
||||
\f[B]PROGRAMDATA\f[]/cheat/conf.yml
|
||||
.PP
|
||||
\f[B]cheat\f[] will search in the order specified above.
|
||||
The first \f[I]conf.yaml\f[] encountered will be respected.
|
||||
.PP
|
||||
If \f[B]cheat\f[] cannot locate a config file, it will ask if you'd like
|
||||
to generate one automatically.
|
||||
Alternatively, you may also generate a config file manually by running
|
||||
\f[B]cheat \[en]init\f[] and saving its output to the appropriate
|
||||
location for your platform.
|
||||
.SS Cheatpaths
|
||||
.PP
|
||||
\f[B]cheat\f[] reads its cheatsheets from \[lq]cheatpaths\[rq], which
|
||||
are the directories in which cheatsheets are stored.
|
||||
Cheatpaths may be configured in \f[I]conf.yaml\f[], and viewed via
|
||||
\f[B]cheat \-d\f[].
|
||||
.PP
|
||||
For detailed instructions on how to configure cheatpaths, please refer
|
||||
to the comments in conf.yml.
|
||||
.SS Autocompletion
|
||||
.PP
|
||||
Autocompletion scripts for \f[B]bash\f[], \f[B]zsh\f[], and
|
||||
\f[B]fish\f[] are available for download:
|
||||
.IP \[bu] 2
|
||||
<https://github.com/cheat/cheat/blob/master/scripts/cheat.bash>
|
||||
.IP \[bu] 2
|
||||
<https://github.com/cheat/cheat/blob/master/scripts/cheat.fish>
|
||||
.IP \[bu] 2
|
||||
<https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh>
|
||||
.PP
|
||||
The \f[B]bash\f[] and \f[B]zsh\f[] scripts provide optional integration
|
||||
with \f[B]fzf\f[], if the latter is available on your \f[B]PATH\f[].
|
||||
.PP
|
||||
The installation process will vary per system and shell configuration,
|
||||
and thus will not be discussed here.
|
||||
.SH ENVIRONMENT
|
||||
.TP
|
||||
.B \f[B]CHEAT_CONFIG_PATH\f[]
|
||||
The path at which the config file is available.
|
||||
If \f[B]CHEAT_CONFIG_PATH\f[] is set, all other config paths will be
|
||||
ignored.
|
||||
.RS
|
||||
.RE
|
||||
.TP
|
||||
.B \f[B]CHEAT_USE_FZF\f[]
|
||||
If set, autocompletion scripts will attempt to integrate with
|
||||
\f[B]fzf\f[].
|
||||
.RS
|
||||
.RE
|
||||
.SH RETURN VALUES
|
||||
.IP "0." 3
|
||||
Successful termination
|
||||
.IP "1." 3
|
||||
Application error
|
||||
.IP "2." 3
|
||||
Cheatsheet(s) not found
|
||||
.SH BUGS
|
||||
.PP
|
||||
See GitHub issues: <https://github.com/cheat/cheat/issues>
|
||||
.SH AUTHOR
|
||||
.PP
|
||||
Christopher Allen Lane <chris@chris-allen-lane.com>
|
||||
.SH SEE ALSO
|
||||
.PP
|
||||
\f[B]fzf(1)\f[]
|
doc/cheat.1.md (new file, 192 lines)
@ -0,0 +1,192 @@
|
||||
% CHEAT(1) | General Commands Manual
|
||||
|
||||
NAME
|
||||
====
|
||||
|
||||
**cheat** — create and view command-line cheatsheets
|
||||
|
||||
SYNOPSIS
|
||||
========
|
||||
|
||||
| **cheat** \[options] \[_CHEATSHEET_]
|
||||
|
||||
DESCRIPTION
|
||||
===========
|
||||
**cheat** allows you to create and view interactive cheatsheets on the
|
||||
command-line. It was designed to help remind \*nix system administrators of
|
||||
options for commands that they use frequently, but not frequently enough to
|
||||
remember.
|
||||
|
||||
OPTIONS
|
||||
=======
|
||||
|
||||
--init
|
||||
: Print a config file to stdout.
|
||||
|
||||
-c, --colorize
|
||||
: Colorize output.
|
||||
|
||||
-d, --directories
|
||||
: List cheatsheet directories.
|
||||
|
||||
-e, --edit=_CHEATSHEET_
|
||||
: Open _CHEATSHEET_ for editing.
|
||||
|
||||
-l, --list
|
||||
: List available cheatsheets.
|
||||
|
||||
-p, --path=_PATH_
|
||||
: Filter only to sheets found on path _PATH_.
|
||||
|
||||
-r, --regex
|
||||
: Treat search _PHRASE_ as a regular expression.
|
||||
|
||||
-s, --search=_PHRASE_
|
||||
: Search cheatsheets for _PHRASE_.
|
||||
|
||||
-t, --tag=_TAG_
|
||||
: Filter only to sheets tagged with _TAG_.
|
||||
|
||||
-T, --tags
|
||||
: List all tags in use.
|
||||
|
||||
-v, --version
|
||||
: Print the version number.
|
||||
|
||||
--rm=_CHEATSHEET_
|
||||
: Remove (delete) _CHEATSHEET_.
|
||||
|
||||
|
||||
EXAMPLES
|
||||
========
|
||||
|
||||
To view the foo cheatsheet:
|
||||
: cheat _foo_
|
||||
|
||||
To edit (or create) the foo cheatsheet:
|
||||
: cheat -e _foo_
|
||||
|
||||
To edit (or create) the foo/bar cheatsheet on the 'work' cheatpath:
|
||||
: cheat -p _work_ -e _foo/bar_
|
||||
|
||||
To view all cheatsheet directories:
|
||||
: cheat -d
|
||||
|
||||
To list all available cheatsheets:
|
||||
: cheat -l
|
||||
|
||||
To list all cheatsheets whose titles match 'apt':
|
||||
: cheat -l _apt_
|
||||
|
||||
To list all tags in use:
|
||||
: cheat -T
|
||||
|
||||
To list available cheatsheets that are tagged as 'personal':
|
||||
: cheat -l -t _personal_
|
||||
|
||||
To search for 'ssh' among all cheatsheets, and colorize matches:
|
||||
: cheat -c -s _ssh_
|
||||
|
||||
To search (by regex) for cheatsheets that contain an IP address:
|
||||
: cheat -c -r -s _'(?:[0-9]{1,3}\.){3}[0-9]{1,3}'_
|
||||
|
||||
To remove (delete) the foo/bar cheatsheet:
|
||||
: cheat --rm _foo/bar_
|
||||
|
||||
|
||||
FILES
|
||||
=====
|
||||
|
||||
Configuration
|
||||
-------------
|
||||
**cheat** is configured via a YAML file that is conventionally named
|
||||
_conf.yml_. **cheat** will search for _conf.yml_ in varying locations,
|
||||
depending upon your platform:
|
||||
|
||||
### Linux, OSX, and other Unixes ###
|
||||
|
||||
1. **CHEAT_CONFIG_PATH**
|
||||
2. **XDG_CONFIG_HOME**/cheat/conf.yml
|
||||
3. **$HOME**/.config/cheat/conf.yml
|
||||
4. **$HOME**/.cheat/conf.yml
|
||||
|
||||
### Windows ###
|
||||
|
||||
1. **CHEAT_CONFIG_PATH**
|
||||
2. **APPDATA**/cheat/conf.yml
|
||||
3. **PROGRAMDATA**/cheat/conf.yml
|
||||
|
||||
**cheat** will search in the order specified above. The first _conf.yml_
|
||||
encountered will be respected.
|
||||
|
||||
If **cheat** cannot locate a config file, it will ask if you'd like to generate
|
||||
one automatically. Alternatively, you may also generate a config file manually
|
||||
by running **cheat --init** and saving its output to the appropriate location
|
||||
for your platform.
|
||||
|
||||
|
||||
Cheatpaths
|
||||
----------
|
||||
**cheat** reads its cheatsheets from "cheatpaths", which are the directories in
|
||||
which cheatsheets are stored. Cheatpaths may be configured in _conf.yml_, and
|
||||
viewed via **cheat -d**.
|
||||
|
||||
For detailed instructions on how to configure cheatpaths, please refer to the
|
||||
comments in conf.yml.
|
||||
|
||||
|
||||
Autocompletion
|
||||
--------------
|
||||
Autocompletion scripts for **bash**, **zsh**, and **fish** are available for
|
||||
download:
|
||||
|
||||
- <https://github.com/cheat/cheat/blob/master/scripts/cheat.bash>
|
||||
- <https://github.com/cheat/cheat/blob/master/scripts/cheat.fish>
|
||||
- <https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh>
|
||||
|
||||
The **bash** and **zsh** scripts provide optional integration with **fzf**, if
|
||||
the latter is available on your **PATH**.
|
||||
|
||||
The installation process will vary per system and shell configuration, and thus
|
||||
will not be discussed here.
|
||||
|
||||
|
||||
ENVIRONMENT
|
||||
===========
|
||||
|
||||
**CHEAT_CONFIG_PATH**
|
||||
|
||||
: The path at which the config file is available. If **CHEAT_CONFIG_PATH** is
|
||||
set, all other config paths will be ignored.
|
||||
|
||||
**CHEAT_USE_FZF**
|
||||
|
||||
: If set, autocompletion scripts will attempt to integrate with **fzf**.
|
||||
|
||||
RETURN VALUES
|
||||
=============
|
||||
|
||||
0. Successful termination
|
||||
|
||||
1. Application error
|
||||
|
||||
2. Cheatsheet(s) not found
|
||||
|
||||
|
||||
BUGS
|
||||
====
|
||||
|
||||
See GitHub issues: <https://github.com/cheat/cheat/issues>
|
||||
|
||||
|
||||
AUTHOR
|
||||
======
|
||||
|
||||
Christopher Allen Lane <chris@chris-allen-lane.com>
|
||||
|
||||
|
||||
SEE ALSO
|
||||
========
|
||||
|
||||
**fzf(1)**
|
||||
|
16
go.mod
@ -1,14 +1,18 @@
|
||||
module github.com/cheat/cheat
|
||||
|
||||
go 1.13
|
||||
go 1.14
|
||||
|
||||
require (
|
||||
github.com/alecthomas/chroma v0.6.7
|
||||
github.com/alecthomas/chroma v0.9.1
|
||||
github.com/davecgh/go-spew v1.1.1
|
||||
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815
|
||||
github.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b
|
||||
github.com/kr/text v0.2.0 // indirect
|
||||
github.com/mattn/go-isatty v0.0.13
|
||||
github.com/mitchellh/go-homedir v1.1.0
|
||||
github.com/tj/front v0.0.0-20170212063142-739be213b0a1
|
||||
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0 // indirect
|
||||
gopkg.in/yaml.v2 v2.2.4
|
||||
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e // indirect
|
||||
github.com/sergi/go-diff v1.1.0 // indirect
|
||||
golang.org/x/sys v0.0.0-20210608053332-aa57babbf139 // indirect
|
||||
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f // indirect
|
||||
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0
|
||||
gopkg.in/yaml.v2 v2.4.0
|
||||
)
|
||||
|
69
go.sum
@ -1,62 +1,59 @@
|
||||
github.com/GeertJohan/go.incremental v1.0.0/go.mod h1:6fAjUhbVuX1KcMD3c8TEgVUqmo4seqhv0i0kdATSkM0=
|
||||
github.com/GeertJohan/go.rice v1.0.0/go.mod h1:eH6gbSOAUv07dQuZVnBmoDP8mgsM1rtixis4Tib9if0=
|
||||
github.com/akavel/rsrc v0.8.0/go.mod h1:uLoCtb9J+EyAqh+26kdrTgmzRBFPGOolLWKpdxkKq+c=
|
||||
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
|
||||
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
|
||||
github.com/alecthomas/chroma v0.6.7 h1:1hKci+AyKOxJrugR9veaocu9DQGR2/GecI72BpaO0Rg=
|
||||
github.com/alecthomas/chroma v0.6.7/go.mod h1:zVlgtbRS7BJDrDY9SB238RmpoCBCYFlLmcfZ3durxTk=
|
||||
github.com/alecthomas/chroma v0.9.1 h1:cBmvQqRImzR5aWqdMxYZByND4S7BCS/g0svZb28h0Dc=
|
||||
github.com/alecthomas/chroma v0.9.1/go.mod h1:eMuEnpA18XbG/WhOWtCzJHS7WqEtDAI+HxdwoW0nVSk=
|
||||
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
|
||||
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
|
||||
github.com/alecthomas/kong v0.1.17-0.20190424132513-439c674f7ae0/go.mod h1:+inYUSluD+p4L8KdviBSgzcqEjUQOfC5fQDRFuc36lI=
|
||||
github.com/alecthomas/kong v0.2.1-0.20190708041108-0548c6b1afae/go.mod h1:+inYUSluD+p4L8KdviBSgzcqEjUQOfC5fQDRFuc36lI=
|
||||
github.com/alecthomas/kong-hcl v0.1.8-0.20190615233001-b21fea9723c8/go.mod h1:MRgZdU3vrFd05IQ89AxUZ0aYdF39BYoNFa324SodPCA=
|
||||
github.com/alecthomas/kong v0.2.4/go.mod h1:kQOmtJgV+Lb4aj+I2LEn40cbtawdWJ9Y8QLq+lElKxE=
|
||||
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
|
||||
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
|
||||
github.com/daaku/go.zipexe v1.0.0/go.mod h1:z8IiR6TsVLEYKwXAoE/I+8ys/sDkgTzSL0CLnGVd57E=
|
||||
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
|
||||
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
|
||||
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
|
||||
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/dlclark/regexp2 v1.1.6 h1:CqB4MjHw0MFCDj+PHHjiESmHX+N7t0tJzKvC6M97BRg=
|
||||
github.com/dlclark/regexp2 v1.1.6/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
|
||||
github.com/dlclark/regexp2 v1.4.0 h1:F1rxgk7p4uKjwIQxBs9oAXe5CqrXlCduYEJvrF4u93E=
|
||||
github.com/dlclark/regexp2 v1.4.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
|
||||
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 h1:bWDMxwH3px2JBh6AyO7hdCn/PkvCZXii8TGj7sbtEbQ=
|
||||
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE=
|
||||
github.com/gorilla/csrf v1.6.0/go.mod h1:7tSf8kmjNYr7IWDCYhd3U8Ck34iQ/Yw5CJu7bAkHEGI=
|
||||
github.com/gorilla/handlers v1.4.1/go.mod h1:Qkdc/uu4tH4g6mTK6auzZ766c4CA0Ng8+o/OAirnOIQ=
|
||||
github.com/gorilla/mux v1.7.3/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
|
||||
github.com/gorilla/securecookie v1.1.1/go.mod h1:ra0sb63/xPlUeL+yeDciTfxMRAA+MP+HVt/4epWDjd4=
|
||||
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
|
||||
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
|
||||
github.com/mattn/go-colorable v0.0.9 h1:UVL0vNpWh04HeJXV0KLcaT7r06gOH2l4OW6ddYRUIY4=
|
||||
github.com/mattn/go-colorable v0.0.9/go.mod h1:9vuHe8Xs5qXnSaW/c/ABM9alt+Vo+STaOChaDxuIBZU=
|
||||
github.com/mattn/go-isatty v0.0.4 h1:bnP0vzxcAdeI1zdubAl5PjU6zsERjGZb7raWodagDYs=
|
||||
github.com/mattn/go-isatty v0.0.4/go.mod h1:M+lRXTBqGeGNdLjl/ufCoiOlB5xdOkqRJdNxMWT7Zi4=
|
||||
github.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b h1:j7+1HpAFS1zy5+Q4qx1fWh90gTKwiN4QCGoY9TWyyO4=
|
||||
github.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b/go.mod h1:01TrycV0kFyexm33Z7vhZRXopbI8J3TDReVlkTgMUxE=
|
||||
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
|
||||
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
|
||||
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
|
||||
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
|
||||
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
|
||||
github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
|
||||
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
|
||||
github.com/mattn/go-isatty v0.0.13 h1:qdl+GuBjcsKKDco5BsxPJlId98mSWNKqYA+Co0SC1yA=
|
||||
github.com/mattn/go-isatty v0.0.13/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
|
||||
github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
|
||||
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
|
||||
github.com/mitchellh/mapstructure v1.1.2/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
|
||||
github.com/nkovacs/streamquote v0.0.0-20170412213628-49af9bddb229/go.mod h1:0aYXnNPJ8l7uZxf45rWW1a/uME32OF0rhiYGNQ2oF2E=
|
||||
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWbfPhv4DMiApHyliiK5xCTNVSPiaAs=
|
||||
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
|
||||
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
|
||||
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
|
||||
github.com/sergi/go-diff v1.1.0 h1:we8PVUC3FE2uYfodKH/nBHMSetSfHDR6scGdBi+erh0=
|
||||
github.com/sergi/go-diff v1.1.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=
|
||||
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
||||
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
|
||||
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
|
||||
github.com/tj/front v0.0.0-20170212063142-739be213b0a1 h1:lA+aPRvltlx2fwv/BnxyYSDQo3pIeqzHgMO5GvK0T9E=
|
||||
github.com/tj/front v0.0.0-20170212063142-739be213b0a1/go.mod h1:deJrtusCTptAW4EUn5vBLpl3dhNqPqUwEjWJz5UNxpQ=
|
||||
github.com/valyala/bytebufferpool v1.0.0/go.mod h1:6bBcMArwyJ5K/AmCkWv1jt77kVWyCJ6HpOuEn7z0Csc=
|
||||
github.com/valyala/fasttemplate v1.0.1/go.mod h1:UQGH1tvbgY+Nz5t2n7tXsz52dQxojPUpymEIMZ47gx8=
|
||||
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35 h1:YAFjXN64LMvktoUZH9zgY4lGc/msGN7HQfoSuKCgaDU=
|
||||
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
|
||||
github.com/stretchr/testify v1.4.0 h1:2E4SXV/wtOkTonXsotYi4li6zVWxYlZuYNCXe9XRJyk=
|
||||
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
|
||||
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20210608053332-aa57babbf139 h1:C+AwYEtBp/VQwoLntUmQ/yx3MS9vmZaKNdw5eOpoQe8=
|
||||
golang.org/x/sys v0.0.0-20210608053332-aa57babbf139/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f h1:BLraFXnmrev5lT+xlilqcH8XK9/i0At2xKjWk4p6zsU=
|
||||
gopkg.in/check.v1 v1.0.0-20200227125254-8fa46927fb4f/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0 h1:POO/ycCATvegFmVuPpQzZFJ+pGZeX22Ufu6fibxDVjU=
|
||||
gopkg.in/yaml.v1 v1.0.0-20140924161607-9f9df34309c0/go.mod h1:WDnlLJ4WF5VGsH/HVa3CI79GS0ol3YnhVnKP89i0kNg=
|
||||
gopkg.in/yaml.v2 v2.2.4 h1:/eiJrUcujPVeJ3xlSWaiNi3uSVmDGBK1pDHUHAnao1I=
|
||||
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
|
||||
gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
|
||||
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
|
||||
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
|
||||
|
@ -2,8 +2,8 @@ package cheatpath
|
||||
|
||||
// Cheatpath encapsulates cheatsheet path information
|
||||
type Cheatpath struct {
|
||||
Name string `yaml:name`
|
||||
Path string `yaml:path`
|
||||
ReadOnly bool `yaml:readonly`
|
||||
Tags []string `yaml:tags`
|
||||
Name string `yaml:"name"`
|
||||
Path string `yaml:"path"`
|
||||
ReadOnly bool `yaml:"readonly"`
|
||||
Tags []string `yaml:"tags"`
|
||||
}
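
The quoted `yaml:"..."` tags above are what let the lowercase keys in _conf.yml_ populate the exported struct fields. A minimal, self-contained sketch (not part of this changeset) of how a cheatpaths section might be unmarshalled with `gopkg.in/yaml.v2` from inside the module; the path values are purely illustrative:

```go
package main

import (
	"fmt"

	"github.com/cheat/cheat/internal/cheatpath"
	"gopkg.in/yaml.v2"
)

func main() {
	// a hypothetical cheatpaths section, as it might appear in conf.yml
	raw := `
cheatpaths:
  - name: community
    path: ~/.config/cheat/cheatsheets/community
    readonly: true
    tags: [ community ]
  - name: personal
    path: ~/.config/cheat/cheatsheets/personal
    readonly: false
    tags: [ personal ]
`

	// the quoted struct tags are what map these lowercase keys onto the
	// exported fields of cheatpath.Cheatpath
	var conf struct {
		Cheatpaths []cheatpath.Cheatpath `yaml:"cheatpaths"`
	}

	if err := yaml.Unmarshal([]byte(raw), &conf); err != nil {
		panic(err)
	}

	fmt.Printf("%+v\n", conf.Cheatpaths)
}
```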
|
||||
|
26
internal/config/color.go
Normal file
@ -0,0 +1,26 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"os"
|
||||
|
||||
"github.com/mattn/go-isatty"
|
||||
)
|
||||
|
||||
// Color indicates whether colorization should be applied to the output
|
||||
func (c *Config) Color(opts map[string]interface{}) bool {
|
||||
|
||||
// default to the colorization specified in the configs...
|
||||
colorize := c.Colorize
|
||||
|
||||
// ... however, only apply colorization if we're writing to a tty...
|
||||
if !isatty.IsTerminal(os.Stdout.Fd()) && !isatty.IsCygwinTerminal(os.Stdout.Fd()) {
|
||||
colorize = false
|
||||
}
|
||||
|
||||
// ... *unless* the --colorize flag was passed
|
||||
if opts["--colorize"] == true {
|
||||
colorize = true
|
||||
}
|
||||
|
||||
return colorize
|
||||
}
|
22
internal/config/color_test.go
Normal file
@ -0,0 +1,22 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
// TestColor asserts that colorization rules are properly respected
|
||||
func TestColor(t *testing.T) {
|
||||
|
||||
// mock a config
|
||||
conf := Config{}
|
||||
|
||||
opts := map[string]interface{}{"--colorize": false}
|
||||
if conf.Color(opts) {
|
||||
t.Errorf("failed to respect --colorize (false)")
|
||||
}
|
||||
|
||||
opts = map[string]interface{}{"--colorize": true}
|
||||
if !conf.Color(opts) {
|
||||
t.Errorf("failed to respect --colorize (true)")
|
||||
}
|
||||
}
|
@ -4,6 +4,8 @@ import (
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
|
||||
cp "github.com/cheat/cheat/internal/cheatpath"
|
||||
|
||||
@ -13,15 +15,16 @@ import (
|
||||
|
||||
// Config encapsulates configuration parameters
|
||||
type Config struct {
|
||||
Colorize bool `yaml:colorize`
|
||||
Editor string `yaml:editor`
|
||||
Cheatpaths []cp.Cheatpath `yaml:cheatpaths`
|
||||
Style string `yaml:style`
|
||||
Formatter string `yaml:formatter`
|
||||
Colorize bool `yaml:"colorize"`
|
||||
Editor string `yaml:"editor"`
|
||||
Cheatpaths []cp.Cheatpath `yaml:"cheatpaths"`
|
||||
Style string `yaml:"style"`
|
||||
Formatter string `yaml:"formatter"`
|
||||
Pager string `yaml:"pager"`
|
||||
}
|
||||
|
||||
// New returns a new Config struct
|
||||
func New(opts map[string]interface{}, confPath string) (Config, error) {
|
||||
func New(opts map[string]interface{}, confPath string, resolve bool) (Config, error) {
|
||||
|
||||
// read the config file
|
||||
buf, err := ioutil.ReadFile(confPath)
|
||||
@ -38,14 +41,54 @@ func New(opts map[string]interface{}, confPath string) (Config, error) {
|
||||
return Config{}, fmt.Errorf("could not unmarshal yaml: %v", err)
|
||||
}
|
||||
|
||||
// expand ~ in config paths
|
||||
// if a .cheat directory exists locally, append it to the cheatpaths
|
||||
cwd, err := os.Getwd()
|
||||
if err != nil {
|
||||
return Config{}, fmt.Errorf("failed to get cwd: %v", err)
|
||||
}
|
||||
|
||||
local := filepath.Join(cwd, ".cheat")
|
||||
if _, err := os.Stat(local); err == nil {
|
||||
path := cp.Cheatpath{
|
||||
Name: "cwd",
|
||||
Path: local,
|
||||
ReadOnly: false,
|
||||
Tags: []string{},
|
||||
}
|
||||
|
||||
conf.Cheatpaths = append(conf.Cheatpaths, path)
|
||||
}
|
||||
|
||||
// process cheatpaths
|
||||
for i, cheatpath := range conf.Cheatpaths {
|
||||
|
||||
// expand ~ in config paths
|
||||
expanded, err := homedir.Expand(cheatpath.Path)
|
||||
if err != nil {
|
||||
return Config{}, fmt.Errorf("failed to expand ~: %v", err)
|
||||
}
|
||||
|
||||
// follow symlinks
|
||||
//
|
||||
// NB: `resolve` is an ugly kludge that exists for the sake of unit-tests.
|
||||
// It's necessary because `EvalSymlinks` will error if the symlink points
|
||||
// to a non-existent location on the filesystem. When unit-testing,
|
||||
// however, we don't want to have dependencies on the filesystem. As such,
|
||||
// `resolve` is a switch that allows us to turn off symlink resolution when
|
||||
// running the config tests.
|
||||
if resolve {
|
||||
evaled, err := filepath.EvalSymlinks(expanded)
|
||||
if err != nil {
|
||||
return Config{}, fmt.Errorf(
|
||||
"failed to resolve symlink: %s: %v",
|
||||
expanded,
|
||||
err,
|
||||
)
|
||||
}
|
||||
|
||||
expanded = evaled
|
||||
}
|
||||
|
||||
conf.Cheatpaths[i].Path = expanded
|
||||
}
|
||||
|
||||
@ -70,5 +113,10 @@ func New(opts map[string]interface{}, confPath string) (Config, error) {
|
||||
conf.Formatter = "terminal16m"
|
||||
}
|
||||
|
||||
// if a pager was not provided, fall back to using no pager at all
|
||||
if strings.TrimSpace(conf.Pager) == "" {
|
||||
conf.Pager = ""
|
||||
}
|
||||
|
||||
return conf, nil
|
||||
}
|
||||
|
@ -17,7 +17,7 @@ import (
|
||||
func TestConfigSuccessful(t *testing.T) {
|
||||
|
||||
// initialize a config
|
||||
conf, err := New(map[string]interface{}{}, mock.Path("conf/conf.yml"))
|
||||
conf, err := New(map[string]interface{}{}, mock.Path("conf/conf.yml"), false)
|
||||
if err != nil {
|
||||
t.Errorf("failed to parse config file: %v", err)
|
||||
}
|
||||
@ -69,7 +69,7 @@ func TestConfigSuccessful(t *testing.T) {
|
||||
func TestConfigFailure(t *testing.T) {
|
||||
|
||||
// attempt to read a non-existent config file
|
||||
_, err := New(map[string]interface{}{}, "/does-not-exit")
|
||||
_, err := New(map[string]interface{}{}, "/does-not-exit", false)
|
||||
if err == nil {
|
||||
t.Errorf("failed to error on unreadable config")
|
||||
}
|
||||
@ -84,14 +84,14 @@ func TestEmptyEditor(t *testing.T) {
|
||||
os.Setenv("EDITOR", "")
|
||||
|
||||
// initialize a config
|
||||
conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"))
|
||||
conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
|
||||
if err == nil {
|
||||
t.Errorf("failed to return an error on empty editor")
|
||||
}
|
||||
|
||||
// set editor, and assert that it is respected
|
||||
os.Setenv("EDITOR", "foo")
|
||||
conf, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"))
|
||||
conf, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
|
||||
if err != nil {
|
||||
t.Errorf("failed to init configs: %v", err)
|
||||
}
|
||||
@ -101,7 +101,7 @@ func TestEmptyEditor(t *testing.T) {
|
||||
|
||||
// set visual, and assert that it overrides editor
|
||||
os.Setenv("VISUAL", "bar")
|
||||
conf, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"))
|
||||
conf, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
|
||||
if err != nil {
|
||||
t.Errorf("failed to init configs: %v", err)
|
||||
}
|
||||
|
24
internal/config/init.go
Normal file
@ -0,0 +1,24 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"path/filepath"
|
||||
)
|
||||
|
||||
// Init initializes a config file
|
||||
func Init(confpath string, configs string) error {
|
||||
|
||||
// assert that the config directory exists
|
||||
if err := os.MkdirAll(filepath.Dir(confpath), 0755); err != nil {
|
||||
return fmt.Errorf("failed to create directory: %v", err)
|
||||
}
|
||||
|
||||
// write the config file
|
||||
if err := ioutil.WriteFile(confpath, []byte(configs), 0644); err != nil {
|
||||
return fmt.Errorf("failed to create file: %v", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
38
internal/config/init_test.go
Normal file
@ -0,0 +1,38 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"testing"
|
||||
)
|
||||
|
||||
// TestInit asserts that configs are properly initialized
|
||||
func TestInit(t *testing.T) {
|
||||
|
||||
// initialize a temporary config file
|
||||
confFile, err := ioutil.TempFile("", "cheat-test")
|
||||
if err != nil {
|
||||
t.Errorf("failed to create temp file: %v", err)
|
||||
}
|
||||
|
||||
// clean up the temp file
|
||||
defer os.Remove(confFile.Name())
|
||||
|
||||
// initialize the config file
|
||||
conf := "mock config data"
|
||||
if err = Init(confFile.Name(), conf); err != nil {
|
||||
t.Errorf("failed to init config file: %v", err)
|
||||
}
|
||||
|
||||
// read back the config file contents
|
||||
bytes, err := ioutil.ReadFile(confFile.Name())
|
||||
if err != nil {
|
||||
t.Errorf("failed to read config file: %v", err)
|
||||
}
|
||||
|
||||
// assert that the contents were written correctly
|
||||
got := string(bytes)
|
||||
if got != conf {
|
||||
t.Errorf("failed to write configs: want: %s, got: %s", conf, got)
|
||||
}
|
||||
}
|
@ -3,58 +3,10 @@ package config
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"path"
|
||||
|
||||
"github.com/mitchellh/go-homedir"
|
||||
)
|
||||
|
||||
// Path returns the config file path
|
||||
func Path(sys string) (string, error) {
|
||||
|
||||
var paths []string
|
||||
|
||||
// if CHEAT_CONFIG_PATH is set, return it
|
||||
if os.Getenv("CHEAT_CONFIG_PATH") != "" {
|
||||
|
||||
// expand ~
|
||||
expanded, err := homedir.Expand(os.Getenv("CHEAT_CONFIG_PATH"))
|
||||
if err != nil {
|
||||
return "", fmt.Errorf("failed to expand ~: %v", err)
|
||||
}
|
||||
|
||||
return expanded, nil
|
||||
|
||||
// OSX config paths
|
||||
} else if sys == "darwin" {
|
||||
|
||||
paths = []string{
|
||||
path.Join(os.Getenv("XDG_CONFIG_HOME"), "/cheat/conf.yml"),
|
||||
path.Join(os.Getenv("HOME"), ".config/cheat/conf.yml"),
|
||||
path.Join(os.Getenv("HOME"), ".cheat/conf.yml"),
|
||||
}
|
||||
|
||||
// Linux config paths
|
||||
} else if sys == "linux" {
|
||||
|
||||
paths = []string{
|
||||
path.Join(os.Getenv("XDG_CONFIG_HOME"), "/cheat/conf.yml"),
|
||||
path.Join(os.Getenv("HOME"), ".config/cheat/conf.yml"),
|
||||
path.Join(os.Getenv("HOME"), ".cheat/conf.yml"),
|
||||
"/etc/cheat/conf.yml",
|
||||
}
|
||||
|
||||
// Windows config paths
|
||||
} else if sys == "windows" {
|
||||
|
||||
paths = []string{
|
||||
fmt.Sprintf("%s/cheat/conf.yml", os.Getenv("APPDATA")),
|
||||
fmt.Sprintf("%s/cheat/conf.yml", os.Getenv("PROGRAMDATA")),
|
||||
}
|
||||
|
||||
// Unsupported platforms
|
||||
} else {
|
||||
return "", fmt.Errorf("unsupported os: %s", sys)
|
||||
}
|
||||
func Path(paths []string) (string, error) {
|
||||
|
||||
// check if the config file exists on any paths
|
||||
for _, p := range paths {
|
||||
|
53
internal/config/path_test.go
Normal file
@ -0,0 +1,53 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"io/ioutil"
|
||||
"os"
|
||||
"testing"
|
||||
)
|
||||
|
||||
// TestPathConfigNotExists asserts that `Path` identifies non-existent config
|
||||
// files
|
||||
func TestPathConfigNotExists(t *testing.T) {
|
||||
|
||||
// package (invalid) cheatpaths
|
||||
paths := []string{"/cheat-test-conf-does-not-exist"}
|
||||
|
||||
// assert
|
||||
if _, err := Path(paths); err == nil {
|
||||
t.Errorf("failed to identify non-existent config file")
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
// TestPathConfigExists asserts that `Path` identifies existent config files
|
||||
func TestPathConfigExists(t *testing.T) {
|
||||
|
||||
// initialize a temporary config file
|
||||
confFile, err := ioutil.TempFile("", "cheat-test")
|
||||
if err != nil {
|
||||
t.Errorf("failed to create temp file: %v", err)
|
||||
}
|
||||
|
||||
// clean up the temp file
|
||||
defer os.Remove(confFile.Name())
|
||||
|
||||
// package cheatpaths
|
||||
paths := []string{
|
||||
"/cheat-test-conf-does-not-exist",
|
||||
confFile.Name(),
|
||||
}
|
||||
|
||||
// assert
|
||||
got, err := Path(paths)
|
||||
if err != nil {
|
||||
t.Errorf("failed to identify config file: %v", err)
|
||||
}
|
||||
if got != confFile.Name() {
|
||||
t.Errorf(
|
||||
"failed to return config path: want: %s, got: %s",
|
||||
confFile.Name(),
|
||||
got,
|
||||
)
|
||||
}
|
||||
}
|
54
internal/config/paths.go
Normal file
@ -0,0 +1,54 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"path"
|
||||
|
||||
"github.com/mitchellh/go-homedir"
|
||||
)
|
||||
|
||||
// Paths returns config file paths that are appropriate for the operating
|
||||
// system
|
||||
func Paths(
|
||||
sys string,
|
||||
home string,
|
||||
envvars map[string]string,
|
||||
) ([]string, error) {
|
||||
|
||||
// if `CHEAT_CONFIG_PATH` is set, expand ~ and return it
|
||||
if confpath, ok := envvars["CHEAT_CONFIG_PATH"]; ok {
|
||||
|
||||
// expand ~
|
||||
expanded, err := homedir.Expand(confpath)
|
||||
if err != nil {
|
||||
return []string{}, fmt.Errorf("failed to expand ~: %v", err)
|
||||
}
|
||||
|
||||
return []string{expanded}, nil
|
||||
}
|
||||
|
||||
switch sys {
|
||||
case "darwin", "linux", "freebsd":
|
||||
paths := []string{}
|
||||
|
||||
// don't include the `XDG_CONFIG_HOME` path if that envvar is not set
|
||||
if xdgpath, ok := envvars["XDG_CONFIG_HOME"]; ok {
|
||||
paths = append(paths, path.Join(xdgpath, "/cheat/conf.yml"))
|
||||
}
|
||||
|
||||
paths = append(paths, []string{
|
||||
path.Join(home, ".config/cheat/conf.yml"),
|
||||
path.Join(home, ".cheat/conf.yml"),
|
||||
"/etc/cheat/conf.yml",
|
||||
}...)
|
||||
|
||||
return paths, nil
|
||||
case "windows":
|
||||
return []string{
|
||||
path.Join(envvars["APPDATA"], "/cheat/conf.yml"),
|
||||
path.Join(envvars["PROGRAMDATA"], "/cheat/conf.yml"),
|
||||
}, nil
|
||||
default:
|
||||
return []string{}, fmt.Errorf("unsupported os: %s", sys)
|
||||
}
|
||||
}
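
A usage sketch (not part of this changeset) of how `Paths` and the reworked `Path` might be wired together from inside the module, e.g. from `cmd/cheat`. Building the environment map from `os.Environ()` is an assumption here, not something this diff prescribes:

```go
package main

import (
	"fmt"
	"os"
	"runtime"
	"strings"

	"github.com/cheat/cheat/internal/config"
	"github.com/mitchellh/go-homedir"
)

func main() {
	// resolve the user's home directory
	home, err := homedir.Dir()
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to locate home directory: %v\n", err)
		os.Exit(1)
	}

	// flatten the environment into the map that Paths expects
	envvars := map[string]string{}
	for _, e := range os.Environ() {
		pair := strings.SplitN(e, "=", 2)
		envvars[pair[0]] = pair[1]
	}

	// compute the candidate config paths for this platform...
	paths, err := config.Paths(runtime.GOOS, home, envvars)
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to compute config paths: %v\n", err)
		os.Exit(1)
	}

	// ... and use the first one that actually exists on disk
	confpath, err := config.Path(paths)
	if err != nil {
		fmt.Fprintln(os.Stderr, "no config file found")
		os.Exit(1)
	}

	fmt.Println(confpath)
}
```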
|
175
internal/config/paths_test.go
Normal file
@ -0,0 +1,175 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"reflect"
|
||||
"testing"
|
||||
|
||||
"github.com/davecgh/go-spew/spew"
|
||||
)
|
||||
|
||||
// TestValidatePathsNix asserts that the proper config paths are returned on
|
||||
// *nix platforms
|
||||
func TestValidatePathsNix(t *testing.T) {
|
||||
|
||||
// mock the user's home directory
|
||||
home := "/home/foo"
|
||||
|
||||
// mock some envvars
|
||||
envvars := map[string]string{
|
||||
"XDG_CONFIG_HOME": "/home/bar",
|
||||
}
|
||||
|
||||
// specify the platforms to test
|
||||
oses := []string{
|
||||
"darwin",
|
||||
"freebsd",
|
||||
"linux",
|
||||
}
|
||||
|
||||
// test each *nix os
|
||||
for _, os := range oses {
|
||||
// get the paths for the platform
|
||||
paths, err := Paths(os, home, envvars)
|
||||
if err != nil {
|
||||
t.Errorf("paths returned an error: %v", err)
|
||||
}
|
||||
|
||||
// specify the expected output
|
||||
want := []string{
|
||||
"/home/bar/cheat/conf.yml",
|
||||
"/home/foo/.config/cheat/conf.yml",
|
||||
"/home/foo/.cheat/conf.yml",
|
||||
"/etc/cheat/conf.yml",
|
||||
}
|
||||
|
||||
// assert that output matches expectations
|
||||
if !reflect.DeepEqual(paths, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected paths: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(paths),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// TestValidatePathsNixNoXDG asserts that the proper config paths are returned
|
||||
// on *nix platforms when `XDG_CONFIG_HOME` is not set
|
||||
func TestValidatePathsNixNoXDG(t *testing.T) {
|
||||
|
||||
// mock the user's home directory
|
||||
home := "/home/foo"
|
||||
|
||||
// mock some envvars
|
||||
envvars := map[string]string{}
|
||||
|
||||
// specify the platforms to test
|
||||
oses := []string{
|
||||
"darwin",
|
||||
"freebsd",
|
||||
"linux",
|
||||
}
|
||||
|
||||
// test each *nix os
|
||||
for _, os := range oses {
|
||||
// get the paths for the platform
|
||||
paths, err := Paths(os, home, envvars)
|
||||
if err != nil {
|
||||
t.Errorf("paths returned an error: %v", err)
|
||||
}
|
||||
|
||||
// specify the expected output
|
||||
want := []string{
|
||||
"/home/foo/.config/cheat/conf.yml",
|
||||
"/home/foo/.cheat/conf.yml",
|
||||
"/etc/cheat/conf.yml",
|
||||
}
|
||||
|
||||
// assert that output matches expectations
|
||||
if !reflect.DeepEqual(paths, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected paths: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(paths),
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// TestValidatePathsWindows asserts that the proper config paths are returned
|
||||
// on Windows platforms
|
||||
func TestValidatePathsWindows(t *testing.T) {
|
||||
|
||||
// mock the user's home directory
|
||||
home := "not-used-on-windows"
|
||||
|
||||
// mock some envvars
|
||||
envvars := map[string]string{
|
||||
"APPDATA": "/apps",
|
||||
"PROGRAMDATA": "/programs",
|
||||
}
|
||||
|
||||
// get the paths for the platform
|
||||
paths, err := Paths("windows", home, envvars)
|
||||
if err != nil {
|
||||
t.Errorf("paths returned an error: %v", err)
|
||||
}
|
||||
|
||||
// specify the expected output
|
||||
want := []string{
|
||||
"/apps/cheat/conf.yml",
|
||||
"/programs/cheat/conf.yml",
|
||||
}
|
||||
|
||||
// assert that output matches expectations
|
||||
if !reflect.DeepEqual(paths, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected paths: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(paths),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// TestValidatePathsUnsupported asserts that an error is returned on
|
||||
// unsupported platforms
|
||||
func TestValidatePathsUnsupported(t *testing.T) {
|
||||
_, err := Paths("unsupported", "", map[string]string{})
|
||||
if err == nil {
|
||||
t.Errorf("failed to return error on unsupported platform")
|
||||
}
|
||||
}
|
||||
|
||||
// TestValidatePathsCheatConfigPath asserts that the proper config path is
|
||||
// returned when `CHEAT_CONFIG_PATH` is explicitly specified.
|
||||
func TestValidatePathsCheatConfigPath(t *testing.T) {
|
||||
|
||||
// mock the user's home directory
|
||||
home := "/home/foo"
|
||||
|
||||
// mock some envvars
|
||||
envvars := map[string]string{
|
||||
"XDG_CONFIG_HOME": "/home/bar",
|
||||
"CHEAT_CONFIG_PATH": "/home/baz/conf.yml",
|
||||
}
|
||||
|
||||
// get the paths for the platform
|
||||
paths, err := Paths("linux", home, envvars)
|
||||
if err != nil {
|
||||
t.Errorf("paths returned an error: %v", err)
|
||||
}
|
||||
|
||||
// specify the expected output
|
||||
want := []string{
|
||||
"/home/baz/conf.yml",
|
||||
}
|
||||
|
||||
// assert that output matches expectations
|
||||
if !reflect.DeepEqual(paths, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected paths: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(paths),
|
||||
)
|
||||
}
|
||||
}
|
18
internal/display/faint.go
Normal file
@ -0,0 +1,18 @@
|
||||
package display
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// Faint returns a faint string
|
||||
func Faint(str string, conf config.Config) string {
|
||||
// make `str` faint only if colorization has been requested
|
||||
if conf.Colorize {
|
||||
return fmt.Sprintf(fmt.Sprintf("\033[2m%s\033[0m", str))
|
||||
}
|
||||
|
||||
// otherwise, return the string unmodified
|
||||
return str
|
||||
}
|
27
internal/display/faint_test.go
Normal file
@ -0,0 +1,27 @@
|
||||
package display
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// TestFaint asserts that Faint applies faint formatting
|
||||
func TestFaint(t *testing.T) {
|
||||
|
||||
// case: apply colorization
|
||||
conf := config.Config{Colorize: true}
|
||||
want := "\033[2mfoo\033[0m"
|
||||
got := Faint("foo", conf)
|
||||
if want != got {
|
||||
t.Errorf("failed to faint: want: %s, got: %s", want, got)
|
||||
}
|
||||
|
||||
// case: do not apply colorization
|
||||
conf.Colorize = false
|
||||
want = "foo"
|
||||
got = Faint("foo", conf)
|
||||
if want != got {
|
||||
t.Errorf("failed to faint: want: %s, got: %s", want, got)
|
||||
}
|
||||
}
|
21
internal/display/indent.go
Normal file
@ -0,0 +1,21 @@
|
||||
package display
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// Indent prepends each line of a string with a tab
|
||||
func Indent(str string) string {
|
||||
|
||||
// trim superfluous whitespace
|
||||
str = strings.TrimSpace(str)
|
||||
|
||||
// prepend each line with a tab character
|
||||
out := ""
|
||||
for _, line := range strings.Split(str, "\n") {
|
||||
out += fmt.Sprintf("\t%s\n", line)
|
||||
}
|
||||
|
||||
return out
|
||||
}
|
12
internal/display/indent_test.go
Normal file
@ -0,0 +1,12 @@
|
||||
package display
|
||||
|
||||
import "testing"
|
||||
|
||||
// TestIndent asserts that Indent prepends a tab to each line
|
||||
func TestIndent(t *testing.T) {
|
||||
got := Indent("foo\nbar\nbaz")
|
||||
want := "\tfoo\n\tbar\n\tbaz\n"
|
||||
if got != want {
|
||||
t.Errorf("failed to indent: want: %s, got: %s", want, got)
|
||||
}
|
||||
}
|
8
internal/display/underline.go
Normal file
@ -0,0 +1,8 @@
|
||||
package display
|
||||
|
||||
import "fmt"
|
||||
|
||||
// Underline returns an underlined string
|
||||
func Underline(str string) string {
|
||||
return fmt.Sprintf(fmt.Sprintf("\033[4m%s\033[0m", str))
|
||||
}
|
14
internal/display/underline_test.go
Normal file
@ -0,0 +1,14 @@
|
||||
package display
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
// TestUnderline asserts that Underline applies underline formatting
|
||||
func TestUnderline(t *testing.T) {
|
||||
want := "\033[4mfoo\033[0m"
|
||||
got := Underline("foo")
|
||||
if want != got {
|
||||
t.Errorf("failed to underline: want: %s, got: %s", want, got)
|
||||
}
|
||||
}
|
37
internal/display/write.go
Normal file
@ -0,0 +1,37 @@
|
||||
package display
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"os/exec"
|
||||
"strings"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// Write writes output either directly to stdout, or through a pager,
|
||||
// depending upon configuration.
|
||||
func Write(out string, conf config.Config) {
|
||||
// if no pager was configured, print the output to stdout and exit
|
||||
if conf.Pager == "" {
|
||||
fmt.Print(out)
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// otherwise, pipe output through the pager
|
||||
parts := strings.Split(conf.Pager, " ")
|
||||
pager := parts[0]
|
||||
args := parts[1:]
|
||||
|
||||
// run the pager
|
||||
cmd := exec.Command(pager, args...)
|
||||
cmd.Stdin = strings.NewReader(out)
|
||||
cmd.Stdout = os.Stdout
|
||||
|
||||
// handle errors
|
||||
err := cmd.Run()
|
||||
if err != nil {
|
||||
fmt.Fprintf(os.Stderr, "failed to write to pager: %v\n", err)
|
||||
os.Exit(1)
|
||||
}
|
||||
}
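
A small usage sketch (not part of this changeset), assuming the code lives inside the module; the pager value is illustrative. When `Pager` is empty, `Write` prints to stdout and exits the process; otherwise it pipes the output through the configured pager:

```go
package main

import (
	"github.com/cheat/cheat/internal/config"
	"github.com/cheat/cheat/internal/display"
)

func main() {
	// a config with an illustrative pager; with an empty Pager,
	// Write would print directly to stdout and exit the process
	conf := config.Config{Pager: "less -R"}

	// pipe the sheet text through `less -R`
	display.Write("To foo the bar: baz\n", conf)
}
```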
|
44
internal/frontmatter/frontmatter.go
Normal file
@ -0,0 +1,44 @@
|
||||
package frontmatter
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"strings"
|
||||
|
||||
"gopkg.in/yaml.v1"
|
||||
)
|
||||
|
||||
// Frontmatter encapsulates cheatsheet frontmatter data
|
||||
type Frontmatter struct {
|
||||
Tags []string
|
||||
Syntax string
|
||||
}
|
||||
|
||||
// Parse parses cheatsheet frontmatter
|
||||
func Parse(markdown string) (string, Frontmatter, error) {
|
||||
|
||||
// specify the frontmatter delimiter
|
||||
delim := "---"
|
||||
|
||||
// initialize a frontmatter struct
|
||||
var fm Frontmatter
|
||||
|
||||
// if the markdown does not contain frontmatter, pass it through unmodified
|
||||
if !strings.HasPrefix(markdown, delim) {
|
||||
return strings.TrimSpace(markdown), fm, nil
|
||||
}
|
||||
|
||||
// otherwise, split the frontmatter and cheatsheet text
|
||||
parts := strings.SplitN(markdown, delim, 3)
|
||||
|
||||
// return an error if the frontmatter parses into the wrong number of parts
|
||||
if len(parts) != 3 {
|
||||
return markdown, fm, fmt.Errorf("failed to delimit frontmatter")
|
||||
}
|
||||
|
||||
// return an error if the YAML cannot be unmarshalled
|
||||
if err := yaml.Unmarshal([]byte(parts[1]), &fm); err != nil {
|
||||
return markdown, fm, fmt.Errorf("failed to unmarshal frontmatter: %v", err)
|
||||
}
|
||||
|
||||
return strings.TrimSpace(parts[2]), fm, nil
|
||||
}
|
95
internal/frontmatter/frontmatter_test.go
Normal file
@ -0,0 +1,95 @@
|
||||
package frontmatter
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
// TestHasFrontmatter asserts that markdown is properly parsed when it contains
|
||||
// frontmatter
|
||||
func TestHasFrontmatter(t *testing.T) {
|
||||
|
||||
// stub our cheatsheet content
|
||||
markdown := `---
|
||||
syntax: go
|
||||
tags: [ test ]
|
||||
---
|
||||
To foo the bar: baz`
|
||||
|
||||
// parse the frontmatter
|
||||
text, fm, err := Parse(markdown)
|
||||
|
||||
// assert expectations
|
||||
if err != nil {
|
||||
t.Errorf("failed to parse markdown: %v", err)
|
||||
}
|
||||
|
||||
want := "To foo the bar: baz"
|
||||
if text != want {
|
||||
t.Errorf("failed to parse text: want: %s, got: %s", want, text)
|
||||
}
|
||||
|
||||
want = "go"
|
||||
if fm.Syntax != want {
|
||||
t.Errorf("failed to parse syntax: want: %s, got: %s", want, fm.Syntax)
|
||||
}
|
||||
|
||||
want = "test"
|
||||
if fm.Tags[0] != want {
|
||||
t.Errorf("failed to parse tags: want: %s, got: %s", want, fm.Tags[0])
|
||||
}
|
||||
if len(fm.Tags) != 1 {
|
||||
t.Errorf("failed to parse tags: want: len 0, got: len %d", len(fm.Tags))
|
||||
}
|
||||
}
|
||||
|
||||
// TestHasNoFrontmatter asserts that markdown is properly parsed when it does not
|
||||
// contain frontmatter
|
||||
func TestHasNoFrontmatter(t *testing.T) {
|
||||
|
||||
// stub our cheatsheet content
|
||||
markdown := "To foo the bar: baz"
|
||||
|
||||
// parse the frontmatter
|
||||
text, fm, err := Parse(markdown)
|
||||
|
||||
// assert expectations
|
||||
if err != nil {
|
||||
t.Errorf("failed to parse markdown: %v", err)
|
||||
}
|
||||
|
||||
if text != markdown {
|
||||
t.Errorf("failed to parse text: want: %s, got: %s", markdown, text)
|
||||
}
|
||||
|
||||
if fm.Syntax != "" {
|
||||
t.Errorf("failed to parse syntax: want: '', got: %s", fm.Syntax)
|
||||
}
|
||||
|
||||
if len(fm.Tags) != 0 {
|
||||
t.Errorf("failed to parse tags: want: len 0, got: len %d", len(fm.Tags))
|
||||
}
|
||||
}
|
||||
|
||||
// TestHasInvalidFrontmatter asserts that markdown is properly parsed when it
|
||||
// contains invalid frontmatter
|
||||
func TestHasInvalidFrontmatter(t *testing.T) {
|
||||
|
||||
// stub our cheatsheet content (with invalid frontmatter)
|
||||
markdown := `---
|
||||
syntax: go
|
||||
tags: [ test ]
|
||||
To foo the bar: baz`
|
||||
|
||||
// parse the frontmatter
|
||||
text, _, err := Parse(markdown)
|
||||
|
||||
// assert that an error was returned
|
||||
if err == nil {
|
||||
t.Error("failed to error on invalid frontmatter")
|
||||
}
|
||||
|
||||
// assert that the "raw" markdown was returned
|
||||
if text != markdown {
|
||||
t.Errorf("failed to parse text: want: %s, got: %s", markdown, text)
|
||||
}
|
||||
}
|
24
internal/installer/clone.go
Normal file
@ -0,0 +1,24 @@
|
||||
package installer
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"os/exec"
|
||||
)
|
||||
|
||||
const cloneURL = "https://github.com/cheat/cheatsheets.git"
|
||||
|
||||
// clone clones the community cheatsheets
|
||||
func clone(path string) error {
|
||||
|
||||
// invoke `git` in a subprocess to perform the clone
|
||||
cmd := exec.Command("git", "clone", cloneURL, path)
|
||||
cmd.Stdout = os.Stdout
|
||||
cmd.Stderr = os.Stderr
|
||||
err := cmd.Run()
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to clone cheatsheets: %v", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
37
internal/installer/prompt.go
Normal file
@ -0,0 +1,37 @@
|
||||
package installer
|
||||
|
||||
import (
|
||||
"bufio"
|
||||
"fmt"
|
||||
"os"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// Prompt prompts the user for an answer
|
||||
func Prompt(prompt string, def bool) (bool, error) {
|
||||
|
||||
// initialize a line reader
|
||||
reader := bufio.NewReader(os.Stdin)
|
||||
|
||||
// display the prompt
|
||||
fmt.Print(fmt.Sprintf("%s: ", prompt))
|
||||
|
||||
// read the answer
|
||||
ans, err := reader.ReadString('\n')
|
||||
if err != nil {
|
||||
return false, fmt.Errorf("failed to parse input: %v", err)
|
||||
}
|
||||
|
||||
// normalize the answer
|
||||
ans = strings.ToLower(strings.TrimRight(ans, "\n"))
|
||||
|
||||
// return the appropriate response
|
||||
switch ans {
|
||||
case "y":
|
||||
return true, nil
|
||||
case "":
|
||||
return def, nil
|
||||
default:
|
||||
return false, nil
|
||||
}
|
||||
}
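
A brief usage sketch (not part of this changeset) from inside the module; the second argument is the default returned when the user answers with a bare enter:

```go
package main

import (
	"fmt"
	"os"

	"github.com/cheat/cheat/internal/installer"
)

func main() {
	// ask a yes/no question, defaulting to "yes" on an empty answer
	yes, err := installer.Prompt("Download the community cheatsheets? [Y/n]", true)
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to prompt: %v\n", err)
		os.Exit(1)
	}

	fmt.Println("answered yes:", yes)
}
```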
|
55
internal/installer/run.go
Normal file
@ -0,0 +1,55 @@
|
||||
package installer
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"path"
|
||||
"strings"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// Run runs the installer
|
||||
func Run(configs string, confpath string) error {
|
||||
|
||||
// determine the appropriate paths for config data and (optional) community
|
||||
// cheatsheets based on the user's platform
|
||||
confdir := path.Dir(confpath)
|
||||
|
||||
// create paths for community and personal cheatsheets
|
||||
community := path.Join(confdir, "/cheatsheets/community")
|
||||
personal := path.Join(confdir, "/cheatsheets/personal")
|
||||
|
||||
// template the above paths into the default configs
|
||||
configs = strings.Replace(configs, "COMMUNITY_PATH", community, -1)
|
||||
configs = strings.Replace(configs, "PERSONAL_PATH", personal, -1)
|
||||
|
||||
// prompt the user to download the community cheatsheets
|
||||
yes, err := Prompt(
|
||||
"Would you like to download the community cheatsheets? [Y/n]",
|
||||
true,
|
||||
)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to prompt: %v", err)
|
||||
}
|
||||
|
||||
// clone the community cheatsheets if so instructed
|
||||
if yes {
|
||||
// clone the community cheatsheets
|
||||
if err := clone(community); err != nil {
|
||||
return fmt.Errorf("failed to clone cheatsheets: %v", err)
|
||||
}
|
||||
|
||||
// also create a directory for personal cheatsheets
|
||||
if err := os.MkdirAll(personal, os.ModePerm); err != nil {
|
||||
return fmt.Errorf("failed to create directory: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
// the config file does not exist, so we'll try to create one
|
||||
if err = config.Init(confpath, configs); err != nil {
|
||||
return fmt.Errorf("failed to create config file: %v", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
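
A rough sketch (not part of this changeset) of how `Run` might be invoked when no config file could be located, assuming the caller already holds the default config template as a string. The template body and config path below are illustrative; the `COMMUNITY_PATH`/`PERSONAL_PATH` placeholders are the ones `Run` itself substitutes:

```go
package main

import (
	"fmt"
	"os"

	"github.com/cheat/cheat/internal/installer"
)

func main() {
	// a hypothetical default config template; Run replaces the
	// COMMUNITY_PATH and PERSONAL_PATH placeholders with concrete paths
	configs := "editor: vim\ncheatpaths:\n  - name: community\n    path: COMMUNITY_PATH\n    readonly: true\n"

	// the location at which the new config file should be written
	confpath := os.ExpandEnv("$HOME/.config/cheat/conf.yml")

	if err := installer.Run(configs, confpath); err != nil {
		fmt.Fprintf(os.Stderr, "failed to run installer: %v\n", err)
		os.Exit(1)
	}

	fmt.Printf("created config file: %s\n", confpath)
}
```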
|
@ -13,7 +13,7 @@ func Path(filename string) string {
|
||||
// determine the path of this file during runtime
|
||||
_, thisfile, _, _ := runtime.Caller(0)
|
||||
|
||||
// compute the config path
|
||||
// compute the mock path
|
||||
file, err := filepath.Abs(
|
||||
path.Join(
|
||||
filepath.Dir(thisfile),
|
||||
@ -22,7 +22,7 @@ func Path(filename string) string {
|
||||
),
|
||||
)
|
||||
if err != nil {
|
||||
panic(fmt.Errorf("failed to resolve config path: %v", err))
|
||||
panic(fmt.Errorf("failed to resolve mock path: %v", err))
|
||||
}
|
||||
|
||||
return file
|
||||
|
37
internal/sheet/colorize.go
Normal file
@ -0,0 +1,37 @@
|
||||
package sheet
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
|
||||
"github.com/alecthomas/chroma/quick"
|
||||
)
|
||||
|
||||
// Colorize applies syntax-highlighting to a cheatsheet's Text.
|
||||
func (s *Sheet) Colorize(conf config.Config) {
|
||||
|
||||
// if the syntax was not specified, default to bash
|
||||
lex := s.Syntax
|
||||
if lex == "" {
|
||||
lex = "bash"
|
||||
}
|
||||
|
||||
// write colorized text into a buffer
|
||||
var buf bytes.Buffer
|
||||
err := quick.Highlight(
|
||||
&buf,
|
||||
s.Text,
|
||||
lex,
|
||||
conf.Formatter,
|
||||
conf.Style,
|
||||
)
|
||||
|
||||
// if colorization somehow failed, do nothing
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
|
||||
// otherwise, swap the cheatsheet's Text with its colorized equivalent
|
||||
s.Text = buf.String()
|
||||
}
|
34
internal/sheet/colorize_test.go
Normal file
@ -0,0 +1,34 @@
|
||||
package sheet
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/cheat/cheat/internal/config"
|
||||
)
|
||||
|
||||
// TestColorize asserts that syntax-highlighting is correctly applied
|
||||
func TestColorize(t *testing.T) {
|
||||
|
||||
// mock configs
|
||||
conf := config.Config{
|
||||
Formatter: "terminal16m",
|
||||
Style: "solarized-dark",
|
||||
}
|
||||
|
||||
// mock a sheet
|
||||
s := Sheet{
|
||||
Text: "echo 'foo'",
|
||||
}
|
||||
|
||||
// colorize the sheet text
|
||||
s.Colorize(conf)
|
||||
|
||||
// initialize expectations
|
||||
want := "[38;2;181;137;0mecho[0m[38;2;147;161;161m"
|
||||
want += " [0m[38;2;42;161;152m'foo'[0m"
|
||||
|
||||
// assert
|
||||
if s.Text != want {
|
||||
t.Errorf("failed to colorize sheet: want: %s, got: %s", want, s.Text)
|
||||
}
|
||||
}
|
@ -25,7 +25,7 @@ func TestCopyFlat(t *testing.T) {
|
||||
}
|
||||
|
||||
// mock a cheatsheet struct
|
||||
sheet, err := New("foo", src.Name(), []string{}, false)
|
||||
sheet, err := New("foo", "community", src.Name(), []string{}, false)
|
||||
if err != nil {
|
||||
t.Errorf("failed to init cheatsheet: %v", err)
|
||||
}
|
||||
@ -72,7 +72,13 @@ func TestCopyDeep(t *testing.T) {
|
||||
}
|
||||
|
||||
// mock a cheatsheet struct
|
||||
sheet, err := New("/cheat-tests/alpha/bravo/foo", src.Name(), []string{}, false)
|
||||
sheet, err := New(
|
||||
"/cheat-tests/alpha/bravo/foo",
|
||||
"community",
|
||||
src.Name(),
|
||||
[]string{},
|
||||
false,
|
||||
)
|
||||
if err != nil {
|
||||
t.Errorf("failed to init cheatsheet: %v", err)
|
||||
}
|
||||
|
@ -1,7 +0,0 @@
|
||||
package sheet
|
||||
|
||||
// Match encapsulates search matches within cheatsheets
|
||||
type Match struct {
|
||||
Line int
|
||||
Text string
|
||||
}
|
@ -3,43 +3,22 @@ package sheet
|
||||
import (
|
||||
"regexp"
|
||||
"strings"
|
||||
|
||||
"github.com/mgutz/ansi"
|
||||
)
|
||||
|
||||
// Search searches for regexp matches in a cheatsheet's text, and optionally
|
||||
// colorizes matching strings.
|
||||
func (s *Sheet) Search(reg *regexp.Regexp, colorize bool) []Match {
|
||||
// Search returns lines within a sheet's Text that match the search regex
|
||||
func (s *Sheet) Search(reg *regexp.Regexp) string {
|
||||
|
||||
// record matches
|
||||
matches := []Match{}
|
||||
matches := ""
|
||||
|
||||
// search through the cheatsheet's text line by line
|
||||
// TODO: searching line-by-line is surely the "naive" approach. Revisit this
|
||||
// later with an eye for performance improvements.
|
||||
for linenum, line := range strings.Split(s.Text, "\n") {
|
||||
for _, line := range strings.Split(s.Text, "\n\n") {
|
||||
|
||||
// exit early if the line doesn't match the regex
|
||||
if !reg.MatchString(line) {
|
||||
continue
|
||||
if reg.MatchString(line) {
|
||||
matches += line + "\n\n"
|
||||
}
|
||||
|
||||
// init the match
|
||||
m := Match{
|
||||
Line: linenum + 1,
|
||||
Text: strings.TrimSpace(line),
|
||||
}
|
||||
|
||||
// colorize the matching text if so configured
|
||||
if colorize {
|
||||
m.Text = reg.ReplaceAllStringFunc(m.Text, func(matched string) string {
|
||||
return ansi.Color(matched, "red+b")
|
||||
})
|
||||
}
|
||||
|
||||
// record the match
|
||||
matches = append(matches, m)
|
||||
}
|
||||
|
||||
return matches
|
||||
return strings.TrimSpace(matches)
|
||||
}
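
A short usage sketch (not part of this changeset) of the reworked `Search`: matching is now performed against double-newline-delimited blocks, and the matching blocks are returned joined into a single string rather than as the removed `Match` structs:

```go
package main

import (
	"fmt"
	"regexp"

	"github.com/cheat/cheat/internal/sheet"
)

func main() {
	// a sheet whose text contains two paragraph-style blocks
	s := sheet.Sheet{
		Text: "# To foo the bar:\nfoo --bar\n\n# To baz the qux:\nbaz --qux",
	}

	// search case-insensitively for "foo"
	reg := regexp.MustCompile("(?i)foo")

	// only the first block matches, and it is returned whole
	fmt.Println(s.Search(reg))
}
```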
|
||||
|
@ -4,8 +4,6 @@ import (
|
||||
"reflect"
|
||||
"regexp"
|
||||
"testing"
|
||||
|
||||
"github.com/davecgh/go-spew/spew"
|
||||
)
|
||||
|
||||
// TestSearchNoMatch ensures that the expected output is returned when no
|
||||
@ -24,21 +22,21 @@ func TestSearchNoMatch(t *testing.T) {
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, false)
|
||||
matches := sheet.Search(reg)
|
||||
|
||||
// assert that no matches were found
|
||||
if len(matches) != 0 {
|
||||
t.Errorf("failure: expected no matches: got: %s", spew.Sdump(matches))
|
||||
if matches != "" {
|
||||
t.Errorf("failure: expected no matches: got: %s", matches)
|
||||
}
|
||||
}
|
||||
|
||||
// TestSearchSingleMatchNoColor asserts that the expected output is returned
|
||||
// when a single match is returned, and no colorization is applied.
|
||||
func TestSearchSingleMatchNoColor(t *testing.T) {
|
||||
// TestSearchSingleMatch asserts that the expected output is returned
|
||||
// when a single match is returned
|
||||
func TestSearchSingleMatch(t *testing.T) {
|
||||
|
||||
// mock a cheatsheet
|
||||
sheet := Sheet{
|
||||
Text: "The quick brown fox\njumped over\nthe lazy dog.",
|
||||
Text: "The quick brown fox\njumped over\n\nthe lazy dog.",
|
||||
}
|
||||
|
||||
// compile the search regex
|
||||
@ -48,69 +46,28 @@ func TestSearchSingleMatchNoColor(t *testing.T) {
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, false)
|
||||
matches := sheet.Search(reg)
|
||||
|
||||
// specify the expected results
|
||||
want := []Match{
|
||||
Match{
|
||||
Line: 1,
|
||||
Text: "The quick brown fox",
|
||||
},
|
||||
}
|
||||
want := "The quick brown fox\njumped over"
|
||||
|
||||
// assert that the correct matches were returned
|
||||
if !reflect.DeepEqual(matches, want) {
|
||||
if matches != want {
|
||||
t.Errorf(
|
||||
"failed to return expected matches: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(matches),
|
||||
want,
|
||||
matches,
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// TestSearchSingleMatchColorized asserts that the expected output is returned
|
||||
// when a single match is returned, and colorization is applied
|
||||
func TestSearchSingleMatchColorized(t *testing.T) {
|
||||
// TestSearchMultiMatch asserts that the expected output is returned
|
||||
// when multiple matches are returned
|
||||
func TestSearchMultiMatch(t *testing.T) {
|
||||
|
||||
// mock a cheatsheet
|
||||
sheet := Sheet{
|
||||
Text: "The quick brown fox\njumped over\nthe lazy dog.",
|
||||
}
|
||||
|
||||
// compile the search regex
|
||||
reg, err := regexp.Compile("(?i)fox")
|
||||
if err != nil {
|
||||
t.Errorf("failed to compile regex: %v", err)
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, true)
|
||||
|
||||
// specify the expected results
|
||||
want := []Match{
|
||||
Match{
|
||||
Line: 1,
|
||||
Text: "The quick brown \x1b[1;31mfox\x1b[0m",
|
||||
},
|
||||
}
|
||||
|
||||
// assert that the correct matches were returned
|
||||
if !reflect.DeepEqual(matches, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected matches: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(matches),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// TestSearchMultiMatchNoColor asserts that the expected output is returned
|
||||
// when a multiple matches are returned, and no colorization is applied
|
||||
func TestSearchMultiMatchNoColor(t *testing.T) {
|
||||
|
||||
// mock a cheatsheet
|
||||
sheet := Sheet{
|
||||
Text: "The quick brown fox\njumped over\nthe lazy dog.",
|
||||
Text: "The quick brown fox\n\njumped over\n\nthe lazy dog.",
|
||||
}
|
||||
|
||||
// compile the search regex
|
||||
@ -120,66 +77,17 @@ func TestSearchMultiMatchNoColor(t *testing.T) {
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, false)
|
||||
matches := sheet.Search(reg)
|
||||
|
||||
// specify the expected results
|
||||
want := []Match{
|
||||
Match{
|
||||
Line: 1,
|
||||
Text: "The quick brown fox",
|
||||
},
|
||||
Match{
|
||||
Line: 3,
|
||||
Text: "the lazy dog.",
|
||||
},
|
||||
}
|
||||
want := "The quick brown fox\n\nthe lazy dog."
|
||||
|
||||
// assert that the correct matches were returned
|
||||
if !reflect.DeepEqual(matches, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected matches: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(matches),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// TestSearchMultiMatchColorized asserts that the expected output is returned
|
||||
// when a multiple matches are returned, and colorization is applied
|
||||
func TestSearchMultiMatchColorized(t *testing.T) {
|
||||
|
||||
// mock a cheatsheet
|
||||
sheet := Sheet{
|
||||
Text: "The quick brown fox\njumped over\nthe lazy dog.",
|
||||
}
|
||||
|
||||
// compile the search regex
|
||||
reg, err := regexp.Compile("(?i)the")
|
||||
if err != nil {
|
||||
t.Errorf("failed to compile regex: %v", err)
|
||||
}
|
||||
|
||||
// search the sheet
|
||||
matches := sheet.Search(reg, true)
|
||||
|
||||
// specify the expected results
|
||||
want := []Match{
|
||||
Match{
|
||||
Line: 1,
|
||||
Text: "\x1b[1;31mThe\x1b[0m quick brown fox",
|
||||
},
|
||||
Match{
|
||||
Line: 3,
|
||||
Text: "\x1b[1;31mthe\x1b[0m lazy dog.",
|
||||
},
|
||||
}
|
||||
|
||||
// assert that the correct matches were returned
|
||||
if !reflect.DeepEqual(matches, want) {
|
||||
t.Errorf(
|
||||
"failed to return expected matches: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(matches),
|
||||
want,
|
||||
matches,
|
||||
)
|
||||
}
|
||||
}
|
||||
|
@ -4,30 +4,25 @@ import (
|
||||
"fmt"
|
||||
"io/ioutil"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/tj/front"
|
||||
"github.com/cheat/cheat/internal/frontmatter"
|
||||
)
|
||||
|
||||
// frontmatter is an un-exported helper struct used in parsing cheatsheets
|
||||
type frontmatter struct {
|
||||
Tags []string
|
||||
Syntax string
|
||||
}
|
||||
|
||||
// Sheet encapsulates sheet information
|
||||
type Sheet struct {
|
||||
Title string
|
||||
Path string
|
||||
Text string
|
||||
Tags []string
|
||||
Syntax string
|
||||
ReadOnly bool
|
||||
Title string
|
||||
CheatPath string
|
||||
Path string
|
||||
Text string
|
||||
Tags []string
|
||||
Syntax string
|
||||
ReadOnly bool
|
||||
}
|
||||
|
||||
// New initializes a new Sheet
|
||||
func New(
|
||||
title string,
|
||||
cheatpath string,
|
||||
path string,
|
||||
tags []string,
|
||||
readOnly bool,
|
||||
@ -39,9 +34,8 @@ func New(
|
||||
return Sheet{}, fmt.Errorf("failed to read file: %s, %v", path, err)
|
||||
}
|
||||
|
||||
// parse the front-matter
|
||||
var fm frontmatter
|
||||
text, err := front.Unmarshal(markdown, &fm)
|
||||
// parse the cheatsheet frontmatter
|
||||
text, fm, err := frontmatter.Parse(string(markdown))
|
||||
if err != nil {
|
||||
return Sheet{}, fmt.Errorf("failed to parse front-matter: %v", err)
|
||||
}
|
||||
@ -54,11 +48,12 @@ func New(
|
||||
|
||||
// initialize and return a sheet
|
||||
return Sheet{
|
||||
Title: title,
|
||||
Path: path,
|
||||
Text: strings.TrimSpace(string(text)) + "\n",
|
||||
Tags: tags,
|
||||
Syntax: fm.Syntax,
|
||||
ReadOnly: readOnly,
|
||||
Title: title,
|
||||
CheatPath: cheatpath,
|
||||
Path: path,
|
||||
Text: text + "\n",
|
||||
Tags: tags,
|
||||
Syntax: fm.Syntax,
|
||||
ReadOnly: readOnly,
|
||||
}, nil
|
||||
}
|
||||
|
@ -13,6 +13,7 @@ func TestSheetSuccess(t *testing.T) {
|
||||
// initialize a sheet
|
||||
sheet, err := New(
|
||||
"foo",
|
||||
"community",
|
||||
mock.Path("sheet/foo"),
|
||||
[]string{"alpha", "bravo"},
|
||||
false,
|
||||
@ -61,6 +62,7 @@ func TestSheetFailure(t *testing.T) {
|
||||
// initialize a sheet
|
||||
_, err := New(
|
||||
"foo",
|
||||
"community",
|
||||
mock.Path("/does-not-exist"),
|
||||
[]string{"alpha", "bravo"},
|
||||
false,
|
||||
@ -69,3 +71,20 @@ func TestSheetFailure(t *testing.T) {
|
||||
t.Errorf("failed to return an error on unreadable sheet")
|
||||
}
|
||||
}
|
||||
|
||||
// TestSheetFrontMatterFailure asserts that an error is returned if the sheet's
|
||||
// frontmatter cannot be parsed.
|
||||
func TestSheetFrontMatterFailure(t *testing.T) {
|
||||
|
||||
// initialize a sheet
|
||||
_, err := New(
|
||||
"foo",
|
||||
"community",
|
||||
mock.Path("sheet/bad-fm"),
|
||||
[]string{"alpha", "bravo"},
|
||||
false,
|
||||
)
|
||||
if err == nil {
|
||||
t.Errorf("failed to return an error on malformed front-matter")
|
||||
}
|
||||
}
|
||||
|
@ -33,7 +33,7 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
|
||||
|
||||
// fail if an error occurred while walking the directory
|
||||
if err != nil {
|
||||
return fmt.Errorf("error walking path: %v", err)
|
||||
return fmt.Errorf("failed to walk path: %v", err)
|
||||
}
|
||||
|
||||
// don't register directories as cheatsheets
|
||||
@ -45,18 +45,34 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
|
||||
// accessed. Eg: `cheat tar` - `tar` is the title)
|
||||
title := strings.TrimPrefix(
|
||||
strings.TrimPrefix(path, cheatpath.Path),
|
||||
"/",
|
||||
string(os.PathSeparator),
|
||||
)
|
||||
|
||||
// ignore dotfiles. Otherwise, we'll likely load .git/*
|
||||
if strings.HasPrefix(title, ".") {
|
||||
// ignore hidden files and directories. Otherwise, we'll likely load
|
||||
// .git/* and .DS_Store.
|
||||
//
|
||||
// NB: this is still somewhat brittle in that it will miss files
|
||||
// contained within hidden directories in the middle of a path, though
|
||||
// that should not realistically occur.
|
||||
if strings.HasPrefix(title, ".") || strings.HasPrefix(info.Name(), ".") {
|
||||
return nil
|
||||
}
|
||||
|
||||
// parse the cheatsheet file into a `sheet` struct
|
||||
s, err := sheet.New(title, path, cheatpath.Tags, cheatpath.ReadOnly)
|
||||
s, err := sheet.New(
|
||||
title,
|
||||
cheatpath.Name,
|
||||
path,
|
||||
cheatpath.Tags,
|
||||
cheatpath.ReadOnly,
|
||||
)
|
||||
if err != nil {
|
||||
return fmt.Errorf("could not create sheet: %v", err)
|
||||
return fmt.Errorf(
|
||||
"failed to load sheet: %s, path: %s, err: %v",
|
||||
title,
|
||||
path,
|
||||
err,
|
||||
)
|
||||
}
|
||||
|
||||
// register the cheatsheet on its cheatpath, keyed by its title
|
||||
|
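
The comment above concedes that checking only the `title` prefix and `info.Name()` would still miss a hidden directory buried in the middle of a path. A stricter variant could test every segment of the relative path; the helper below is a hypothetical sketch, not part of this change:

```go
package sheets

import (
    "os"
    "strings"
)

// isHidden reports whether any segment of a sheet's relative path is hidden.
// (Hypothetical helper, shown only to illustrate a stricter check.)
func isHidden(relPath string) bool {
    for _, part := range strings.Split(relPath, string(os.PathSeparator)) {
        if strings.HasPrefix(part, ".") {
            return true
        }
    }
    return false
}
```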
@ -1,3 +1,62 @@
|
||||
package sheets
|
||||
|
||||
// TODO
|
||||
import (
|
||||
"path"
|
||||
"testing"
|
||||
|
||||
"github.com/cheat/cheat/internal/cheatpath"
|
||||
"github.com/cheat/cheat/internal/mock"
|
||||
)
|
||||
|
||||
// TestLoad asserts that sheets on valid cheatpaths can be loaded successfully
|
||||
func TestLoad(t *testing.T) {
|
||||
|
||||
// mock cheatpaths
|
||||
cheatpaths := []cheatpath.Cheatpath{
|
||||
{
|
||||
Name: "community",
|
||||
Path: path.Join(mock.Path("cheatsheets"), "community"),
|
||||
ReadOnly: true,
|
||||
},
|
||||
{
|
||||
Name: "personal",
|
||||
Path: path.Join(mock.Path("cheatsheets"), "personal"),
|
||||
ReadOnly: false,
|
||||
},
|
||||
}
|
||||
|
||||
// load cheatsheets
|
||||
sheets, err := Load(cheatpaths)
|
||||
if err != nil {
|
||||
t.Errorf("failed to load cheatsheets: %v", err)
|
||||
}
|
||||
|
||||
// assert that the correct number of sheets loaded
|
||||
// (sheet load details are tested in `sheet_test.go`)
|
||||
want := 4
|
||||
if len(sheets) != want {
|
||||
t.Errorf(
|
||||
"failed to load correct number of cheatsheets: want: %d, got: %d",
|
||||
want,
|
||||
len(sheets),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// TestLoadBadPath asserts that an error is returned if a cheatpath is invalid
|
||||
func TestLoadBadPath(t *testing.T) {
|
||||
|
||||
// mock a bad cheatpath
|
||||
cheatpaths := []cheatpath.Cheatpath{
|
||||
{
|
||||
Name: "badpath",
|
||||
Path: "/cheat/test/path/does/not/exist",
|
||||
ReadOnly: true,
|
||||
},
|
||||
}
|
||||
|
||||
// attempt to load the cheatpath
|
||||
if _, err := Load(cheatpaths); err == nil {
|
||||
t.Errorf("failed to reject invalid cheatpath")
|
||||
}
|
||||
}
|
||||
|
36  internal/sheets/tags.go  Normal file
@@ -0,0 +1,36 @@
package sheets

import (
    "sort"

    "github.com/cheat/cheat/internal/sheet"
)

// Tags returns a slice of all tags in use in any sheet
func Tags(cheatpaths []map[string]sheet.Sheet) []string {

    // create a map of all tags in use in any sheet
    tags := make(map[string]bool)

    // iterate over all tags on all sheets on all cheatpaths
    for _, path := range cheatpaths {
        for _, sheet := range path {
            for _, tag := range sheet.Tags {
                tags[tag] = true
            }
        }
    }

    // restructure the map into a slice
    sorted := []string{}
    for tag := range tags {
        sorted = append(sorted, tag)
    }

    // sort the slice
    sort.Slice(sorted, func(i, j int) bool {
        return sorted[i] < sorted[j]
    })

    return sorted
}
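
Taken together with the `Load` function above, `Tags` yields the deduplicated, sorted tag list across every cheatpath. A minimal sketch of how the two fit together; the cheatpath value is hypothetical, and the `internal/...` packages are only importable from within the cheat module itself:

```go
package main

import (
    "fmt"
    "log"

    "github.com/cheat/cheat/internal/cheatpath"
    "github.com/cheat/cheat/internal/sheets"
)

func main() {
    // a hypothetical personal cheatpath; point it at a real directory
    paths := []cheatpath.Cheatpath{
        {Name: "personal", Path: "/home/example/.config/cheat/cheatsheets/personal"},
    }

    // load all sheets, then print the tags they declare
    loaded, err := sheets.Load(paths)
    if err != nil {
        log.Fatalf("failed to load cheatsheets: %v", err)
    }
    for _, tag := range sheets.Tags(loaded) {
        fmt.Println(tag)
    }
}
```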
51
internal/sheets/tags_test.go
Normal file
@ -0,0 +1,51 @@
|
||||
package sheets
|
||||
|
||||
import (
|
||||
"reflect"
|
||||
"testing"
|
||||
|
||||
"github.com/davecgh/go-spew/spew"
|
||||
|
||||
"github.com/cheat/cheat/internal/sheet"
|
||||
)
|
||||
|
||||
// TestTags asserts that cheatsheet tags are properly returned
|
||||
func TestTags(t *testing.T) {
|
||||
|
||||
// mock cheatsheets available on multiple cheatpaths
|
||||
cheatpaths := []map[string]sheet.Sheet{
|
||||
|
||||
// mock community cheatsheets
|
||||
map[string]sheet.Sheet{
|
||||
"foo": sheet.Sheet{Title: "foo", Tags: []string{"alpha"}},
|
||||
"bar": sheet.Sheet{Title: "bar", Tags: []string{"alpha", "bravo"}},
|
||||
},
|
||||
|
||||
// mock local cheatsheets
|
||||
map[string]sheet.Sheet{
|
||||
"bar": sheet.Sheet{Title: "bar", Tags: []string{"bravo", "charlie"}},
|
||||
"baz": sheet.Sheet{Title: "baz", Tags: []string{"delta"}},
|
||||
},
|
||||
}
|
||||
|
||||
// consolidate the cheatsheets
|
||||
tags := Tags(cheatpaths)
|
||||
|
||||
// specify the expected output
|
||||
want := []string{
|
||||
"alpha",
|
||||
"bravo",
|
||||
"charlie",
|
||||
"delta",
|
||||
}
|
||||
|
||||
// assert that the cheatsheets properly consolidated
|
||||
if !reflect.DeepEqual(tags, want) {
|
||||
t.Errorf(
|
||||
"failed to return tags: want:\n%s, got:\n%s",
|
||||
spew.Sdump(want),
|
||||
spew.Sdump(tags),
|
||||
)
|
||||
}
|
||||
|
||||
}
|
0  mocks/cheatsheets/community/.hiddenfile  Normal file

4  mocks/cheatsheets/community/bar  Normal file
@@ -0,0 +1,4 @@
---
tags: [ community ]
---
This is the bar cheatsheet.

4  mocks/cheatsheets/community/foo  Normal file
@@ -0,0 +1,4 @@
---
tags: [ community ]
---
This is the foo cheatsheet.

4  mocks/cheatsheets/personal/bat  Normal file
@@ -0,0 +1,4 @@
---
tags: [ personal ]
---
This is the bat cheatsheet.

4  mocks/cheatsheets/personal/baz  Normal file
@@ -0,0 +1,4 @@
---
tags: [ personal ]
---
This is the baz cheatsheet.

4  mocks/sheet/bad-fm  Normal file
@@ -0,0 +1,4 @@
---
syntax: sh

This is malformed frontmatter.
74
scripts/cheat.bash
Executable file
@ -0,0 +1,74 @@
|
||||
# cheat(1) completion -*- shell-script -*-
|
||||
|
||||
# generate cheatsheet completions, optionally using `fzf`
|
||||
_cheat_complete_cheatsheets()
|
||||
{
|
||||
if [[ "$CHEAT_USE_FZF" = true ]]; then
|
||||
FZF_COMPLETION_TRIGGER='' _fzf_complete "--no-multi" "$@" < <(
|
||||
cheat -l | tail -n +2 | cut -d' ' -f1
|
||||
)
|
||||
else
|
||||
COMPREPLY=( $(compgen -W "$(cheat -l | tail -n +2 | cut -d' ' -f1)" -- "$cur") )
|
||||
fi
|
||||
}
|
||||
|
||||
# generate tag completions, optionally using `fzf`
|
||||
_cheat_complete_tags()
|
||||
{
|
||||
if [ "$CHEAT_USE_FZF" = true ]; then
|
||||
FZF_COMPLETION_TRIGGER='' _fzf_complete "--no-multi" "$@" < <(cheat -T)
|
||||
else
|
||||
COMPREPLY=( $(compgen -W "$(cheat -T)" -- "$cur") )
|
||||
fi
|
||||
}
|
||||
|
||||
# implement the `cheat` autocompletions
|
||||
_cheat()
|
||||
{
|
||||
local cur prev words cword split
|
||||
_init_completion -s || return
|
||||
|
||||
# complete options that are currently being typed: `--col` => `--colorize`
|
||||
if [[ $cur == -* ]]; then
|
||||
COMPREPLY=( $(compgen -W '$(_parse_help "$1" | sed "s/=//g")' -- "$cur") )
|
||||
[[ $COMPREPLY == *= ]] && compopt -o nospace
|
||||
return
|
||||
fi
|
||||
|
||||
# implement completions
|
||||
case $prev in
|
||||
--colorize|-c|\
|
||||
--directories|-d|\
|
||||
--init|\
|
||||
--regex|-r|\
|
||||
--search|-s|\
|
||||
--tags|-T|\
|
||||
--version|-v)
|
||||
# noop the above, which should implement no completions
|
||||
;;
|
||||
--edit|-e)
|
||||
_cheat_complete_cheatsheets
|
||||
;;
|
||||
--list|-l)
|
||||
_cheat_complete_cheatsheets
|
||||
;;
|
||||
--path|-p)
|
||||
COMPREPLY=( $(compgen -W "$(cheat -d | cut -d':' -f1)" -- "$cur") )
|
||||
;;
|
||||
--rm)
|
||||
_cheat_complete_cheatsheets
|
||||
;;
|
||||
--tag|-t)
|
||||
_cheat_complete_tags
|
||||
;;
|
||||
*)
|
||||
_cheat_complete_cheatsheets
|
||||
;;
|
||||
esac
|
||||
|
||||
$split && return
|
||||
|
||||
} &&
|
||||
complete -F _cheat cheat
|
||||
|
||||
# ex: filetype=sh
|
13  scripts/cheat.fish  Executable file
@@ -0,0 +1,13 @@
complete -c cheat -f -a "(cheat -l | tail -n +2 | cut -d ' ' -f 1)"
complete -c cheat -l init -d "Write a default config file to stdout"
complete -c cheat -s c -l colorize -d "Colorize output"
complete -c cheat -s d -l directories -d "List cheatsheet directories"
complete -c cheat -s e -l edit -x -a "(cheat -l | tail -n +2 | cut -d ' ' -f 1)" -d "Edit cheatsheet"
complete -c cheat -s l -l list -d "List cheatsheets"
complete -c cheat -s p -l path -x -a "(cheat -d | cut -d ':' -f 1)" -d "Return only sheets found on given path"
complete -c cheat -s r -l regex -d "Treat search phrase as a regex"
complete -c cheat -s s -l search -x -d "Search cheatsheets for given phrase"
complete -c cheat -s t -l tag -x -a "(cheat -T)" -d "Return only sheets matching the given tag"
complete -c cheat -s T -l tags -d "List all tags in use"
complete -c cheat -s v -l version -d "Print the version number"
complete -c cheat -l rm -x -a "(cheat -l | tail -n +2 | cut -d ' ' -f 1)" -d "Remove (delete) cheatsheet"
65
scripts/cheat.zsh
Executable file
@ -0,0 +1,65 @@
|
||||
#compdef cheat
|
||||
|
||||
local cheats taglist pathlist
|
||||
|
||||
_cheat_complete_personal_cheatsheets()
|
||||
{
|
||||
cheats=("${(f)$(cheat -l -t personal | tail -n +2 | cut -d' ' -f1)}")
|
||||
_describe -t cheats 'cheats' cheats
|
||||
}
|
||||
|
||||
_cheat_complete_full_cheatsheets()
|
||||
{
|
||||
cheats=("${(f)$(cheat -l | tail -n +2 | cut -d' ' -f1)}")
|
||||
_describe -t cheats 'cheats' cheats
|
||||
}
|
||||
|
||||
_cheat_complete_tags()
|
||||
{
|
||||
taglist=("${(f)$(cheat -T)}")
|
||||
_describe -t taglist 'taglist' taglist
|
||||
}
|
||||
|
||||
_cheat_complete_paths()
|
||||
{
|
||||
pathlist=("${(f)$(cheat -d | cut -d':' -f1)}")
|
||||
_describe -t pathlist 'pathlist' pathlist
|
||||
}
|
||||
|
||||
_cheat() {
|
||||
|
||||
_arguments -C \
|
||||
'(--init)--init[Write a default config file to stdout]: :->none' \
|
||||
'(-c --colorize)'{-c,--colorize}'[Colorize output]: :->none' \
|
||||
'(-d --directories)'{-d,--directories}'[List cheatsheet directories]: :->none' \
|
||||
'(-e --edit)'{-e,--edit}'[Edit <sheet>]: :->personal' \
|
||||
'(-l --list)'{-l,--list}'[List cheatsheets]: :->full' \
|
||||
'(-p --path)'{-p,--path}'[Return only sheets found on path <name>]: :->pathlist' \
|
||||
'(-r --regex)'{-r,--regex}'[Treat search <phrase> as a regex]: :->none' \
|
||||
'(-s --search)'{-s,--search}'[Search cheatsheets for <phrase>]: :->none' \
|
||||
'(-t --tag)'{-t,--tag}'[Return only sheets matching <tag>]: :->taglist' \
|
||||
'(-T --tags)'{-T,--tags}'[List all tags in use]: :->none' \
|
||||
'(-v --version)'{-v,--version}'[Print the version number]: :->none' \
|
||||
'(--rm)--rm[Remove (delete) <sheet>]: :->personal'
|
||||
|
||||
case $state in
|
||||
(none)
|
||||
;;
|
||||
(full)
|
||||
_cheat_complete_full_cheatsheets
|
||||
;;
|
||||
(personal)
|
||||
_cheat_complete_personal_cheatsheets
|
||||
;;
|
||||
(taglist)
|
||||
_cheat_complete_tags
|
||||
;;
|
||||
(pathlist)
|
||||
_cheat_complete_paths
|
||||
;;
|
||||
(*)
|
||||
;;
|
||||
esac
|
||||
}
|
||||
|
||||
compdef _cheat cheat
|
@@ -1,11 +0,0 @@
#!/bin/bash

# This function enables you to choose a cheatsheet to view by selecting output
# from `cheat -l`. `source` it in your shell to enable it. (Consider renaming
# or aliasing it to something convenient.)

# Arguments passed to this function (like --color) will be passed to the second
# invokation of `cheat`.
function cheat-fzf {
    eval `cheat -l | tail -n +2 | fzf | awk -v vars="$*" '{ print "cheat " $1 " -t " $3, vars }'`
}
46
scripts/git/cheatsheets
Executable file
@ -0,0 +1,46 @@
|
||||
#!/bin/sh -e
|
||||
|
||||
pull() {
|
||||
for d in `cheat -d | awk '{print $2}'`;
|
||||
do
|
||||
echo "Update $d"
|
||||
cd "$d"
|
||||
[ -d ".git" ] && git pull || :
|
||||
done
|
||||
|
||||
echo
|
||||
echo "Finished update"
|
||||
}
|
||||
|
||||
push() {
|
||||
for d in `cheat -d | grep -v "community" | awk '{print $2}'`;
|
||||
do
|
||||
cd "$d"
|
||||
if [ -d ".git" ]
|
||||
then
|
||||
echo "Push modifications $d"
|
||||
files=$(git ls-files -mo | tr '\n' ' ')
|
||||
git add -A && git commit -m "Edited files: $files" && git push || :
|
||||
else
|
||||
echo "$(pwd) is not a git managed folder"
|
||||
echo "First connect this to your personal git repository"
|
||||
fi
|
||||
done
|
||||
|
||||
echo
|
||||
echo "Finished push operation"
|
||||
}
|
||||
|
||||
|
||||
if [ "$1" = "pull" ]; then
|
||||
pull
|
||||
elif [ "$1" = "push" ]; then
|
||||
push
|
||||
else
|
||||
echo "Usage:
|
||||
# pull changes
|
||||
cheatsheets pull
|
||||
|
||||
# push changes
|
||||
cheatsheets push"
|
||||
fi
|
19
vendor/github.com/alecthomas/chroma/.gitignore
generated
vendored
Normal file
@ -0,0 +1,19 @@
|
||||
# Binaries for programs and plugins
|
||||
*.exe
|
||||
*.dll
|
||||
*.so
|
||||
*.dylib
|
||||
/cmd/chroma/chroma
|
||||
|
||||
# Test binary, build with `go test -c`
|
||||
*.test
|
||||
|
||||
# Output of the go coverage tool, specifically when used with LiteIDE
|
||||
*.out
|
||||
|
||||
# Project-local glide cache, RE: https://github.com/Masterminds/glide/issues/736
|
||||
.glide/
|
||||
|
||||
_models/
|
||||
|
||||
_examples/
|
60
vendor/github.com/alecthomas/chroma/.golangci.yml
generated
vendored
Normal file
@ -0,0 +1,60 @@
|
||||
run:
|
||||
tests: true
|
||||
skip-dirs:
|
||||
- _examples
|
||||
|
||||
output:
|
||||
print-issued-lines: false
|
||||
|
||||
linters:
|
||||
enable-all: true
|
||||
disable:
|
||||
- maligned
|
||||
- megacheck
|
||||
- lll
|
||||
- gocyclo
|
||||
- dupl
|
||||
- gochecknoglobals
|
||||
- funlen
|
||||
- godox
|
||||
- wsl
|
||||
- gomnd
|
||||
- gocognit
|
||||
- goerr113
|
||||
- nolintlint
|
||||
- testpackage
|
||||
- godot
|
||||
- nestif
|
||||
|
||||
linters-settings:
|
||||
govet:
|
||||
check-shadowing: true
|
||||
gocyclo:
|
||||
min-complexity: 10
|
||||
dupl:
|
||||
threshold: 100
|
||||
goconst:
|
||||
min-len: 8
|
||||
min-occurrences: 3
|
||||
|
||||
issues:
|
||||
max-per-linter: 0
|
||||
max-same: 0
|
||||
exclude-use-default: false
|
||||
exclude:
|
||||
# Captured by errcheck.
|
||||
- '^(G104|G204):'
|
||||
# Very commonly not checked.
|
||||
- 'Error return value of .(.*\.Help|.*\.MarkFlagRequired|(os\.)?std(out|err)\..*|.*Close|.*Flush|os\.Remove(All)?|.*printf?|os\.(Un)?Setenv). is not checked'
|
||||
- 'exported method (.*\.MarshalJSON|.*\.UnmarshalJSON|.*\.EntityURN|.*\.GoString|.*\.Pos) should have comment or be unexported'
|
||||
- 'composite literal uses unkeyed fields'
|
||||
- 'declaration of "err" shadows declaration'
|
||||
- 'should not use dot imports'
|
||||
- 'Potential file inclusion via variable'
|
||||
- 'should have comment or be unexported'
|
||||
- 'comment on exported var .* should be of the form'
|
||||
- 'at least one file in a package should have a package comment'
|
||||
- 'string literal contains the Unicode'
|
||||
- 'methods on the same type should have the same receiver name'
|
||||
- '_TokenType_name should be _TokenTypeName'
|
||||
- '`_TokenType_map` should be `_TokenTypeMap`'
|
35
vendor/github.com/alecthomas/chroma/.goreleaser.yml
generated
vendored
Normal file
@ -0,0 +1,35 @@
|
||||
project_name: chroma
|
||||
release:
|
||||
github:
|
||||
owner: alecthomas
|
||||
name: chroma
|
||||
brews:
|
||||
-
|
||||
install: bin.install "chroma"
|
||||
env:
|
||||
- CGO_ENABLED=0
|
||||
builds:
|
||||
- goos:
|
||||
- linux
|
||||
- darwin
|
||||
- windows
|
||||
goarch:
|
||||
- amd64
|
||||
- "386"
|
||||
goarm:
|
||||
- "6"
|
||||
main: ./cmd/chroma/main.go
|
||||
ldflags: -s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X main.date={{.Date}}
|
||||
binary: chroma
|
||||
archives:
|
||||
-
|
||||
format: tar.gz
|
||||
name_template: '{{ .Binary }}-{{ .Version }}-{{ .Os }}-{{ .Arch }}{{ if .Arm }}v{{
|
||||
.Arm }}{{ end }}'
|
||||
files:
|
||||
- COPYING
|
||||
- README*
|
||||
snapshot:
|
||||
name_template: SNAPSHOT-{{ .Commit }}
|
||||
checksum:
|
||||
name_template: '{{ .ProjectName }}-{{ .Version }}-checksums.txt'
|
19
vendor/github.com/alecthomas/chroma/COPYING
generated
vendored
Normal file
@ -0,0 +1,19 @@
|
||||
Copyright (C) 2017 Alec Thomas
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy of
|
||||
this software and associated documentation files (the "Software"), to deal in
|
||||
the Software without restriction, including without limitation the rights to
|
||||
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
|
||||
of the Software, and to permit persons to whom the Software is furnished to do
|
||||
so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in all
|
||||
copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||
SOFTWARE.
|
19
vendor/github.com/alecthomas/chroma/Makefile
generated
vendored
Normal file
@ -0,0 +1,19 @@
|
||||
.PHONY: chromad upload all
|
||||
|
||||
VERSION ?= $(shell git describe --tags --dirty --always)
|
||||
|
||||
all: README.md tokentype_string.go
|
||||
|
||||
README.md: lexers/*/*.go
|
||||
./table.py
|
||||
|
||||
tokentype_string.go: types.go
|
||||
go generate
|
||||
|
||||
chromad:
|
||||
rm -f chromad
|
||||
(export CGOENABLED=0 GOOS=linux ; cd ./cmd/chromad && go build -ldflags="-X 'main.version=$(VERSION)'" -o ../../chromad .)
|
||||
|
||||
upload: chromad
|
||||
scp chromad root@swapoff.org: && \
|
||||
ssh root@swapoff.org 'install -m755 ./chromad /srv/http/swapoff.org/bin && service chromad restart'
|
268
vendor/github.com/alecthomas/chroma/README.md
generated
vendored
Normal file
@ -0,0 +1,268 @@
|
||||
# Chroma — A general purpose syntax highlighter in pure Go [](https://godoc.org/github.com/alecthomas/chroma) [](https://circleci.com/gh/alecthomas/chroma) [](https://goreportcard.com/report/github.com/alecthomas/chroma) [](https://invite.slack.golangbridge.org/)
|
||||
|
||||
> **NOTE:** As Chroma has just been released, its API is still in flux. That said, the high-level interface should not change significantly.
|
||||
|
||||
Chroma takes source code and other structured text and converts it into syntax
|
||||
highlighted HTML, ANSI-coloured text, etc.
|
||||
|
||||
Chroma is based heavily on [Pygments](http://pygments.org/), and includes
|
||||
translators for Pygments lexers and styles.
|
||||
|
||||
<a id="markdown-table-of-contents" name="table-of-contents"></a>
|
||||
## Table of Contents
|
||||
|
||||
<!-- TOC -->
|
||||
|
||||
1. [Table of Contents](#table-of-contents)
|
||||
2. [Supported languages](#supported-languages)
|
||||
3. [Try it](#try-it)
|
||||
4. [Using the library](#using-the-library)
|
||||
1. [Quick start](#quick-start)
|
||||
2. [Identifying the language](#identifying-the-language)
|
||||
3. [Formatting the output](#formatting-the-output)
|
||||
4. [The HTML formatter](#the-html-formatter)
|
||||
5. [More detail](#more-detail)
|
||||
1. [Lexers](#lexers)
|
||||
2. [Formatters](#formatters)
|
||||
3. [Styles](#styles)
|
||||
6. [Command-line interface](#command-line-interface)
|
||||
7. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)
|
||||
|
||||
<!-- /TOC -->
|
||||
|
||||
<a id="markdown-supported-languages" name="supported-languages"></a>
|
||||
## Supported languages
|
||||
|
||||
Prefix | Language
|
||||
:----: | --------
|
||||
A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Arduino, Awk
|
||||
B | Ballerina, Base Makefile, Bash, Batchfile, BibTeX, BlitzBasic, BNF, Brainfuck
|
||||
C | C, C#, C++, Caddyfile, Caddyfile Directives, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython
|
||||
D | D, Dart, Diff, Django/Jinja, Docker, DTD, Dylan
|
||||
E | EBNF, Elixir, Elm, EmacsLisp, Erlang
|
||||
F | Factor, Fish, Forth, Fortran, FSharp
|
||||
G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, Gherkin, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groovy
|
||||
H | Handlebars, Haskell, Haxe, HCL, Hexdump, HLB, HTML, HTTP, Hy
|
||||
I | Idris, Igor, INI, Io
|
||||
J | J, Java, JavaScript, JSON, Julia, Jungle
|
||||
K | Kotlin
|
||||
L | Lighttpd configuration file, LLVM, Lua
|
||||
M | Mako, markdown, Mason, Mathematica, Matlab, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL
|
||||
N | NASM, Newspeak, Nginx configuration file, Nim, Nix
|
||||
O | Objective-C, OCaml, Octave, OpenSCAD, Org Mode
|
||||
P | PacmanConf, Perl, PHP, PHTML, Pig, PkgConfig, PL/pgSQL, plaintext, Pony, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, PromQL, Protocol Buffer, Puppet, Python, Python 3
|
||||
Q | QBasic
|
||||
R | R, Racket, Ragel, react, ReasonML, reg, reStructuredText, Rexx, Ruby, Rust
|
||||
S | SAS, Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, Snobol, Solidity, SPARQL, SQL, SquidConf, Standard ML, Stylus, Swift, SYSTEMD, systemverilog
|
||||
T | TableGen, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData
|
||||
V | VB.net, verilog, VHDL, VimL, vue
|
||||
W | WDTE
|
||||
X | XML, Xorg
|
||||
Y | YAML, YANG
|
||||
Z | Zig
|
||||
|
||||
|
||||
_I will attempt to keep this section up to date, but an authoritative list can be
|
||||
displayed with `chroma --list`._
|
||||
|
||||
<a id="markdown-try-it" name="try-it"></a>
|
||||
## Try it
|
||||
|
||||
Try out various languages and styles on the [Chroma Playground](https://swapoff.org/chroma/playground/).
|
||||
|
||||
<a id="markdown-using-the-library" name="using-the-library"></a>
|
||||
## Using the library
|
||||
|
||||
Chroma, like Pygments, has the concepts of
|
||||
[lexers](https://github.com/alecthomas/chroma/tree/master/lexers),
|
||||
[formatters](https://github.com/alecthomas/chroma/tree/master/formatters) and
|
||||
[styles](https://github.com/alecthomas/chroma/tree/master/styles).
|
||||
|
||||
Lexers convert source text into a stream of tokens, styles specify how token
|
||||
types are mapped to colours, and formatters convert tokens and styles into
|
||||
formatted output.
|
||||
|
||||
A package exists for each of these, containing a global `Registry` variable
|
||||
with all of the registered implementations. There are also helper functions
|
||||
for using the registry in each package, such as looking up lexers by name or
|
||||
matching filenames, etc.
|
||||
|
||||
In all cases, if a lexer, formatter or style can not be determined, `nil` will
|
||||
be returned. In this situation you may want to default to the `Fallback`
|
||||
value in each respective package, which provides sane defaults.
|
||||
|
||||
<a id="markdown-quick-start" name="quick-start"></a>
|
||||
### Quick start
|
||||
|
||||
A convenience function exists that can be used to simply format some source
|
||||
text, without any effort:
|
||||
|
||||
```go
|
||||
err := quick.Highlight(os.Stdout, someSourceCode, "go", "html", "monokai")
|
||||
```
|
||||
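
For context, a complete runnable version of that one-liner might look like the following sketch; the source snippet is arbitrary, and `quick` refers to the `github.com/alecthomas/chroma/quick` package:

```go
package main

import (
    "log"
    "os"

    "github.com/alecthomas/chroma/quick"
)

func main() {
    someSourceCode := "package main\n\nfunc main() { println(\"hello\") }\n"

    // lexer "go", formatter "html", style "monokai", written to stdout
    if err := quick.Highlight(os.Stdout, someSourceCode, "go", "html", "monokai"); err != nil {
        log.Fatal(err)
    }
}
```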
|
||||
<a id="markdown-identifying-the-language" name="identifying-the-language"></a>
|
||||
### Identifying the language
|
||||
|
||||
To highlight code, you'll first have to identify what language the code is
|
||||
written in. There are three primary ways to do that:
|
||||
|
||||
1. Detect the language from its filename.
|
||||
|
||||
```go
|
||||
lexer := lexers.Match("foo.go")
|
||||
```
|
||||
|
||||
3. Explicitly specify the language by its Chroma syntax ID (a full list is available from `lexers.Names()`).
|
||||
|
||||
```go
|
||||
lexer := lexers.Get("go")
|
||||
```
|
||||
|
||||
3. Detect the language from its content.
|
||||
|
||||
```go
|
||||
lexer := lexers.Analyse("package main\n\nfunc main()\n{\n}\n")
|
||||
```
|
||||
|
||||
In all cases, `nil` will be returned if the language can not be identified.
|
||||
|
||||
```go
|
||||
if lexer == nil {
|
||||
lexer = lexers.Fallback
|
||||
}
|
||||
```
|
||||
|
||||
At this point, it should be noted that some lexers can be extremely chatty. To
|
||||
mitigate this, you can use the coalescing lexer to coalesce runs of identical
|
||||
token types into a single token:
|
||||
|
||||
```go
|
||||
lexer = chroma.Coalesce(lexer)
|
||||
```
|
||||
|
||||
<a id="markdown-formatting-the-output" name="formatting-the-output"></a>
|
||||
### Formatting the output
|
||||
|
||||
Once a language is identified you will need to pick a formatter and a style (theme).
|
||||
|
||||
```go
|
||||
style := styles.Get("swapoff")
|
||||
if style == nil {
|
||||
style = styles.Fallback
|
||||
}
|
||||
formatter := formatters.Get("html")
|
||||
if formatter == nil {
|
||||
formatter = formatters.Fallback
|
||||
}
|
||||
```
|
||||
|
||||
Then obtain an iterator over the tokens:
|
||||
|
||||
```go
|
||||
contents, err := ioutil.ReadAll(r)
|
||||
iterator, err := lexer.Tokenise(nil, string(contents))
|
||||
```
|
||||
|
||||
And finally, format the tokens from the iterator:
|
||||
|
||||
```go
|
||||
err := formatter.Format(w, style, iterator)
|
||||
```
|
||||
|
||||
<a id="markdown-the-html-formatter" name="the-html-formatter"></a>
|
||||
### The HTML formatter
|
||||
|
||||
By default the `html` registered formatter generates standalone HTML with
|
||||
embedded CSS. More flexibility is available through the `formatters/html` package.
|
||||
|
||||
Firstly, the output generated by the formatter can be customised with the
|
||||
following constructor options:
|
||||
|
||||
- `Standalone()` - generate standalone HTML with embedded CSS.
|
||||
- `WithClasses()` - use classes rather than inlined style attributes.
|
||||
- `ClassPrefix(prefix)` - prefix each generated CSS class.
|
||||
- `TabWidth(width)` - Set the rendered tab width, in characters.
|
||||
- `WithLineNumbers()` - Render line numbers (style with `LineNumbers`).
|
||||
- `LinkableLineNumbers()` - Make the line numbers linkable and be a link to themselves.
|
||||
- `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`).
|
||||
- `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans.
|
||||
|
||||
If `WithClasses()` is used, the corresponding CSS can be obtained from the formatter with:
|
||||
|
||||
```go
|
||||
formatter := html.New(html.WithClasses())
|
||||
err := formatter.WriteCSS(w, style)
|
||||
```
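
A further sketch combining several of those options; the option values are arbitrary, and note that in the version of the package vendored in this diff the constructors take explicit arguments (e.g. `WithClasses(true)`), as `formatters/html/html.go` further down shows:

```go
package main

import (
    "log"
    "os"

    "github.com/alecthomas/chroma/formatters/html"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    // an HTML formatter that emits CSS classes, renders line numbers in a
    // copy-friendly table, and highlights lines 3-5
    formatter := html.New(
        html.WithClasses(true),
        html.WithLineNumbers(true),
        html.LineNumbersInTable(true),
        html.HighlightLines([][2]int{{3, 5}}),
        html.TabWidth(4),
    )

    style := styles.Get("monokai")
    if style == nil {
        style = styles.Fallback
    }

    // when classes are used, the CSS is written separately
    if err := formatter.WriteCSS(os.Stdout, style); err != nil {
        log.Fatal(err)
    }

    iterator, err := lexers.Get("go").Tokenise(nil, "package main\n\nfunc main() {}\n")
    if err != nil {
        log.Fatal(err)
    }
    if err := formatter.Format(os.Stdout, style, iterator); err != nil {
        log.Fatal(err)
    }
}
```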
|
||||
|
||||
<a id="markdown-more-detail" name="more-detail"></a>
|
||||
## More detail
|
||||
|
||||
<a id="markdown-lexers" name="lexers"></a>
|
||||
### Lexers
|
||||
|
||||
See the [Pygments documentation](http://pygments.org/docs/lexerdevelopment/)
|
||||
for details on implementing lexers. Most concepts apply directly to Chroma,
|
||||
but see existing lexer implementations for real examples.
|
||||
|
||||
In many cases lexers can be automatically converted directly from Pygments by
|
||||
using the included Python 3 script `pygments2chroma.py`. I use something like
|
||||
the following:
|
||||
|
||||
```sh
|
||||
python3 ~/Projects/chroma/_tools/pygments2chroma.py \
|
||||
pygments.lexers.jvm.KotlinLexer \
|
||||
> ~/Projects/chroma/lexers/kotlin.go \
|
||||
&& gofmt -s -w ~/Projects/chroma/lexers/*.go
|
||||
```
|
||||
|
||||
See notes in [pygments-lexers.txt](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt)
|
||||
for a list of lexers, and notes on some of the issues importing them.
|
||||
|
||||
<a id="markdown-formatters" name="formatters"></a>
|
||||
### Formatters
|
||||
|
||||
Chroma supports HTML output, as well as terminal output in 8 colour, 256 colour, and true-colour.
|
||||
|
||||
A `noop` formatter is included that outputs the token text only, and a `tokens`
|
||||
formatter outputs raw tokens. The latter is useful for debugging lexers.
|
||||
|
||||
<a id="markdown-styles" name="styles"></a>
|
||||
### Styles
|
||||
|
||||
Chroma styles use the [same syntax](http://pygments.org/docs/styles/) as Pygments.
|
||||
|
||||
All Pygments styles have been converted to Chroma using the `_tools/style.py` script.
|
||||
|
||||
When you work with one of [Chroma's styles](https://github.com/alecthomas/chroma/tree/master/styles), know that the `chroma.Background` token type provides the default style for tokens. It does so by defining a foreground color and background color.
|
||||
|
||||
For example, this gives each token name not defined in the style a default color of `#f8f8f8` and uses `#000000` for the highlighted code block's background:
|
||||
|
||||
~~~go
|
||||
chroma.Background: "#f8f8f2 bg:#000000",
|
||||
~~~
|
||||
|
||||
Also, token types in a style file are hierarchical. For instance, when `CommentSpecial` is not defined, Chroma uses the token style from `Comment`. So when several comment tokens use the same color, you'll only need to define `Comment` and override the one that has a different color.
|
||||
|
||||
For a quick overview of the available styles and how they look, check out the [Chroma Style Gallery](https://xyproto.github.io/splash/docs/).
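
A short sketch of defining a custom style programmatically with the same entry syntax; `MustNewStyle` and `StyleEntries` are the style-construction helpers in the main `chroma` package, and the colours chosen here are arbitrary:

```go
package main

import (
    "fmt"

    "github.com/alecthomas/chroma"
)

// a tiny custom style: light-grey text on a black background with muted
// comments; token types not listed here fall back hierarchically, as the
// paragraph above describes
var myStyle = chroma.MustNewStyle("mystyle", chroma.StyleEntries{
    chroma.Background: "#f8f8f2 bg:#000000",
    chroma.Comment:    "#75715e",
})

func main() {
    fmt.Println(myStyle.Name)
}
```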
|
||||
|
||||
<a id="markdown-command-line-interface" name="command-line-interface"></a>
|
||||
## Command-line interface
|
||||
|
||||
A command-line interface to Chroma is included. It can be installed with:
|
||||
|
||||
```sh
|
||||
go get -u github.com/alecthomas/chroma/cmd/chroma
|
||||
```
|
||||
|
||||
<a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a>
|
||||
## What's missing compared to Pygments?
|
||||
|
||||
- Quite a few lexers, for various reasons (pull-requests welcome):
|
||||
- Pygments lexers for complex languages often include custom code to
|
||||
handle certain aspects, such as Perl6's ability to nest code inside
|
||||
regular expressions. These require time and effort to convert.
|
||||
- I mostly only converted languages I had heard of, to reduce the porting cost.
|
||||
- Some more esoteric features of Pygments are omitted for simplicity.
|
||||
- Though the Chroma API supports content detection, very few languages support them.
|
||||
I have plans to implement a statistical analyser at some point, but not enough time.
|
35
vendor/github.com/alecthomas/chroma/coalesce.go
generated
vendored
Normal file
@ -0,0 +1,35 @@
|
||||
package chroma
|
||||
|
||||
// Coalesce is a Lexer interceptor that collapses runs of common types into a single token.
|
||||
func Coalesce(lexer Lexer) Lexer { return &coalescer{lexer} }
|
||||
|
||||
type coalescer struct{ Lexer }
|
||||
|
||||
func (d *coalescer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) {
|
||||
var prev Token
|
||||
it, err := d.Lexer.Tokenise(options, text)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return func() Token {
|
||||
for token := it(); token != (EOF); token = it() {
|
||||
if len(token.Value) == 0 {
|
||||
continue
|
||||
}
|
||||
if prev == EOF {
|
||||
prev = token
|
||||
} else {
|
||||
if prev.Type == token.Type && len(prev.Value) < 8192 {
|
||||
prev.Value += token.Value
|
||||
} else {
|
||||
out := prev
|
||||
prev = token
|
||||
return out
|
||||
}
|
||||
}
|
||||
}
|
||||
out := prev
|
||||
prev = EOF
|
||||
return out
|
||||
}, nil
|
||||
}
|
164
vendor/github.com/alecthomas/chroma/colour.go
generated
vendored
Normal file
@ -0,0 +1,164 @@
|
||||
package chroma
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"math"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// ANSI2RGB maps ANSI colour names, as supported by Chroma, to hex RGB values.
|
||||
var ANSI2RGB = map[string]string{
|
||||
"#ansiblack": "000000",
|
||||
"#ansidarkred": "7f0000",
|
||||
"#ansidarkgreen": "007f00",
|
||||
"#ansibrown": "7f7fe0",
|
||||
"#ansidarkblue": "00007f",
|
||||
"#ansipurple": "7f007f",
|
||||
"#ansiteal": "007f7f",
|
||||
"#ansilightgray": "e5e5e5",
|
||||
// Normal
|
||||
"#ansidarkgray": "555555",
|
||||
"#ansired": "ff0000",
|
||||
"#ansigreen": "00ff00",
|
||||
"#ansiyellow": "ffff00",
|
||||
"#ansiblue": "0000ff",
|
||||
"#ansifuchsia": "ff00ff",
|
||||
"#ansiturquoise": "00ffff",
|
||||
"#ansiwhite": "ffffff",
|
||||
|
||||
// Aliases without the "ansi" prefix, because...why?
|
||||
"#black": "000000",
|
||||
"#darkred": "7f0000",
|
||||
"#darkgreen": "007f00",
|
||||
"#brown": "7f7fe0",
|
||||
"#darkblue": "00007f",
|
||||
"#purple": "7f007f",
|
||||
"#teal": "007f7f",
|
||||
"#lightgray": "e5e5e5",
|
||||
// Normal
|
||||
"#darkgray": "555555",
|
||||
"#red": "ff0000",
|
||||
"#green": "00ff00",
|
||||
"#yellow": "ffff00",
|
||||
"#blue": "0000ff",
|
||||
"#fuchsia": "ff00ff",
|
||||
"#turquoise": "00ffff",
|
||||
"#white": "ffffff",
|
||||
}
|
||||
|
||||
// Colour represents an RGB colour.
|
||||
type Colour int32
|
||||
|
||||
// NewColour creates a Colour directly from RGB values.
|
||||
func NewColour(r, g, b uint8) Colour {
|
||||
return ParseColour(fmt.Sprintf("%02x%02x%02x", r, g, b))
|
||||
}
|
||||
|
||||
// Distance between this colour and another.
|
||||
//
|
||||
// This uses the approach described here (https://www.compuphase.com/cmetric.htm).
|
||||
// This is not as accurate as LAB, et. al. but is *vastly* simpler and sufficient for our needs.
|
||||
func (c Colour) Distance(e2 Colour) float64 {
|
||||
ar, ag, ab := int64(c.Red()), int64(c.Green()), int64(c.Blue())
|
||||
br, bg, bb := int64(e2.Red()), int64(e2.Green()), int64(e2.Blue())
|
||||
rmean := (ar + br) / 2
|
||||
r := ar - br
|
||||
g := ag - bg
|
||||
b := ab - bb
|
||||
return math.Sqrt(float64((((512 + rmean) * r * r) >> 8) + 4*g*g + (((767 - rmean) * b * b) >> 8)))
|
||||
}
|
||||
|
||||
// Brighten returns a copy of this colour with its brightness adjusted.
|
||||
//
|
||||
// If factor is negative, the colour is darkened.
|
||||
//
|
||||
// Uses approach described here (http://www.pvladov.com/2012/09/make-color-lighter-or-darker.html).
|
||||
func (c Colour) Brighten(factor float64) Colour {
|
||||
r := float64(c.Red())
|
||||
g := float64(c.Green())
|
||||
b := float64(c.Blue())
|
||||
|
||||
if factor < 0 {
|
||||
factor++
|
||||
r *= factor
|
||||
g *= factor
|
||||
b *= factor
|
||||
} else {
|
||||
r = (255-r)*factor + r
|
||||
g = (255-g)*factor + g
|
||||
b = (255-b)*factor + b
|
||||
}
|
||||
return NewColour(uint8(r), uint8(g), uint8(b))
|
||||
}
|
||||
|
||||
// BrightenOrDarken brightens a colour if it is < 0.5 brighteness or darkens if > 0.5 brightness.
|
||||
func (c Colour) BrightenOrDarken(factor float64) Colour {
|
||||
if c.Brightness() < 0.5 {
|
||||
return c.Brighten(factor)
|
||||
}
|
||||
return c.Brighten(-factor)
|
||||
}
|
||||
|
||||
// Brightness of the colour (roughly) in the range 0.0 to 1.0
|
||||
func (c Colour) Brightness() float64 {
|
||||
return (float64(c.Red()) + float64(c.Green()) + float64(c.Blue())) / 255.0 / 3.0
|
||||
}
|
||||
|
||||
// ParseColour in the forms #rgb, #rrggbb, #ansi<colour>, or #<colour>.
|
||||
// Will return an "unset" colour if invalid.
|
||||
func ParseColour(colour string) Colour {
|
||||
colour = normaliseColour(colour)
|
||||
n, err := strconv.ParseUint(colour, 16, 32)
|
||||
if err != nil {
|
||||
return 0
|
||||
}
|
||||
return Colour(n + 1)
|
||||
}
|
||||
|
||||
// MustParseColour is like ParseColour except it panics if the colour is invalid.
|
||||
//
|
||||
// Will panic if colour is in an invalid format.
|
||||
func MustParseColour(colour string) Colour {
|
||||
parsed := ParseColour(colour)
|
||||
if !parsed.IsSet() {
|
||||
panic(fmt.Errorf("invalid colour %q", colour))
|
||||
}
|
||||
return parsed
|
||||
}
|
||||
|
||||
// IsSet returns true if the colour is set.
|
||||
func (c Colour) IsSet() bool { return c != 0 }
|
||||
|
||||
func (c Colour) String() string { return fmt.Sprintf("#%06x", int(c-1)) }
|
||||
func (c Colour) GoString() string { return fmt.Sprintf("Colour(0x%06x)", int(c-1)) }
|
||||
|
||||
// Red component of colour.
|
||||
func (c Colour) Red() uint8 { return uint8(((c - 1) >> 16) & 0xff) }
|
||||
|
||||
// Green component of colour.
|
||||
func (c Colour) Green() uint8 { return uint8(((c - 1) >> 8) & 0xff) }
|
||||
|
||||
// Blue component of colour.
|
||||
func (c Colour) Blue() uint8 { return uint8((c - 1) & 0xff) }
|
||||
|
||||
// Colours is an orderable set of colours.
|
||||
type Colours []Colour
|
||||
|
||||
func (c Colours) Len() int { return len(c) }
|
||||
func (c Colours) Swap(i, j int) { c[i], c[j] = c[j], c[i] }
|
||||
func (c Colours) Less(i, j int) bool { return c[i] < c[j] }
|
||||
|
||||
// Convert colours to #rrggbb.
|
||||
func normaliseColour(colour string) string {
|
||||
if ansi, ok := ANSI2RGB[colour]; ok {
|
||||
return ansi
|
||||
}
|
||||
if strings.HasPrefix(colour, "#") {
|
||||
colour = colour[1:]
|
||||
if len(colour) == 3 {
|
||||
return colour[0:1] + colour[0:1] + colour[1:2] + colour[1:2] + colour[2:3] + colour[2:3]
|
||||
}
|
||||
}
|
||||
return colour
|
||||
}
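
A brief usage sketch of this colour type (the values are arbitrary); it exercises `ParseColour`, the ANSI aliases from the table above, and `Brighten`:

```go
package main

import (
    "fmt"

    "github.com/alecthomas/chroma"
)

func main() {
    c := chroma.ParseColour("#ansired") // resolves via the ANSI2RGB table
    fmt.Println(c.String())             // #ff0000
    fmt.Println(c.Brightness())         // roughly 0.33
    fmt.Println(c.Brighten(-0.5))       // a darker red
}
```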
|
137
vendor/github.com/alecthomas/chroma/delegate.go
generated
vendored
Normal file
@ -0,0 +1,137 @@
|
||||
package chroma
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
)
|
||||
|
||||
type delegatingLexer struct {
|
||||
root Lexer
|
||||
language Lexer
|
||||
}
|
||||
|
||||
// DelegatingLexer combines two lexers to handle the common case of a language embedded inside another, such as PHP
|
||||
// inside HTML or PHP inside plain text.
|
||||
//
|
||||
// It takes two lexer as arguments: a root lexer and a language lexer. First everything is scanned using the language
|
||||
// lexer, which must return "Other" for unrecognised tokens. Then all "Other" tokens are lexed using the root lexer.
|
||||
// Finally, these two sets of tokens are merged.
|
||||
//
|
||||
// The lexers from the template lexer package use this base lexer.
|
||||
func DelegatingLexer(root Lexer, language Lexer) Lexer {
|
||||
return &delegatingLexer{
|
||||
root: root,
|
||||
language: language,
|
||||
}
|
||||
}
|
||||
|
||||
func (d *delegatingLexer) Config() *Config {
|
||||
return d.language.Config()
|
||||
}
|
||||
|
||||
// An insertion is the character range where language tokens should be inserted.
|
||||
type insertion struct {
|
||||
start, end int
|
||||
tokens []Token
|
||||
}
|
||||
|
||||
func (d *delegatingLexer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) { // nolint: gocognit
|
||||
tokens, err := Tokenise(Coalesce(d.language), options, text)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
// Compute insertions and gather "Other" tokens.
|
||||
others := &bytes.Buffer{}
|
||||
insertions := []*insertion{}
|
||||
var insert *insertion
|
||||
offset := 0
|
||||
var last Token
|
||||
for _, t := range tokens {
|
||||
if t.Type == Other {
|
||||
if last != EOF && insert != nil && last.Type != Other {
|
||||
insert.end = offset
|
||||
}
|
||||
others.WriteString(t.Value)
|
||||
} else {
|
||||
if last == EOF || last.Type == Other {
|
||||
insert = &insertion{start: offset}
|
||||
insertions = append(insertions, insert)
|
||||
}
|
||||
insert.tokens = append(insert.tokens, t)
|
||||
}
|
||||
last = t
|
||||
offset += len(t.Value)
|
||||
}
|
||||
|
||||
if len(insertions) == 0 {
|
||||
return d.root.Tokenise(options, text)
|
||||
}
|
||||
|
||||
// Lex the other tokens.
|
||||
rootTokens, err := Tokenise(Coalesce(d.root), options, others.String())
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// Interleave the two sets of tokens.
|
||||
var out []Token
|
||||
offset = 0 // Offset into text.
|
||||
tokenIndex := 0
|
||||
nextToken := func() Token {
|
||||
if tokenIndex >= len(rootTokens) {
|
||||
return EOF
|
||||
}
|
||||
t := rootTokens[tokenIndex]
|
||||
tokenIndex++
|
||||
return t
|
||||
}
|
||||
insertionIndex := 0
|
||||
nextInsertion := func() *insertion {
|
||||
if insertionIndex >= len(insertions) {
|
||||
return nil
|
||||
}
|
||||
i := insertions[insertionIndex]
|
||||
insertionIndex++
|
||||
return i
|
||||
}
|
||||
t := nextToken()
|
||||
i := nextInsertion()
|
||||
for t != EOF || i != nil {
|
||||
// fmt.Printf("%d->%d:%q %d->%d:%q\n", offset, offset+len(t.Value), t.Value, i.start, i.end, Stringify(i.tokens...))
|
||||
if t == EOF || (i != nil && i.start < offset+len(t.Value)) {
|
||||
var l Token
|
||||
l, t = splitToken(t, i.start-offset)
|
||||
if l != EOF {
|
||||
out = append(out, l)
|
||||
offset += len(l.Value)
|
||||
}
|
||||
out = append(out, i.tokens...)
|
||||
offset += i.end - i.start
|
||||
if t == EOF {
|
||||
t = nextToken()
|
||||
}
|
||||
i = nextInsertion()
|
||||
} else {
|
||||
out = append(out, t)
|
||||
offset += len(t.Value)
|
||||
t = nextToken()
|
||||
}
|
||||
}
|
||||
return Literator(out...), nil
|
||||
}
|
||||
|
||||
func splitToken(t Token, offset int) (l Token, r Token) {
|
||||
if t == EOF {
|
||||
return EOF, EOF
|
||||
}
|
||||
if offset == 0 {
|
||||
return EOF, t
|
||||
}
|
||||
if offset == len(t.Value) {
|
||||
return t, EOF
|
||||
}
|
||||
l = t.Clone()
|
||||
r = t.Clone()
|
||||
l.Value = l.Value[:offset]
|
||||
r.Value = r.Value[offset:]
|
||||
return
|
||||
}
|
7
vendor/github.com/alecthomas/chroma/doc.go
generated
vendored
Normal file
@ -0,0 +1,7 @@
|
||||
// Package chroma takes source code and other structured text and converts it into syntax highlighted HTML, ANSI-
|
||||
// coloured text, etc.
|
||||
//
|
||||
// Chroma is based heavily on Pygments, and includes translators for Pygments lexers and styles.
|
||||
//
|
||||
// For more information, go here: https://github.com/alecthomas/chroma
|
||||
package chroma
|
43
vendor/github.com/alecthomas/chroma/formatter.go
generated
vendored
Normal file
@ -0,0 +1,43 @@
|
||||
package chroma
|
||||
|
||||
import (
|
||||
"io"
|
||||
)
|
||||
|
||||
// A Formatter for Chroma lexers.
|
||||
type Formatter interface {
|
||||
// Format returns a formatting function for tokens.
|
||||
//
|
||||
// If the iterator panics, the Formatter should recover.
|
||||
Format(w io.Writer, style *Style, iterator Iterator) error
|
||||
}
|
||||
|
||||
// A FormatterFunc is a Formatter implemented as a function.
|
||||
//
|
||||
// Guards against iterator panics.
|
||||
type FormatterFunc func(w io.Writer, style *Style, iterator Iterator) error
|
||||
|
||||
func (f FormatterFunc) Format(w io.Writer, s *Style, it Iterator) (err error) { // nolint
|
||||
defer func() {
|
||||
if perr := recover(); perr != nil {
|
||||
err = perr.(error)
|
||||
}
|
||||
}()
|
||||
return f(w, s, it)
|
||||
}
|
||||
|
||||
type recoveringFormatter struct {
|
||||
Formatter
|
||||
}
|
||||
|
||||
func (r recoveringFormatter) Format(w io.Writer, s *Style, it Iterator) (err error) {
|
||||
defer func() {
|
||||
if perr := recover(); perr != nil {
|
||||
err = perr.(error)
|
||||
}
|
||||
}()
|
||||
return r.Formatter.Format(w, s, it)
|
||||
}
|
||||
|
||||
// RecoveringFormatter wraps a formatter with panic recovery.
|
||||
func RecoveringFormatter(formatter Formatter) Formatter { return recoveringFormatter{formatter} }
|
57
vendor/github.com/alecthomas/chroma/formatters/api.go
generated
vendored
Normal file
@ -0,0 +1,57 @@
|
||||
package formatters
|
||||
|
||||
import (
|
||||
"io"
|
||||
"sort"
|
||||
|
||||
"github.com/alecthomas/chroma"
|
||||
"github.com/alecthomas/chroma/formatters/html"
|
||||
"github.com/alecthomas/chroma/formatters/svg"
|
||||
)
|
||||
|
||||
var (
|
||||
// NoOp formatter.
|
||||
NoOp = Register("noop", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, iterator chroma.Iterator) error {
|
||||
for t := iterator(); t != chroma.EOF; t = iterator() {
|
||||
if _, err := io.WriteString(w, t.Value); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}))
|
||||
// Default HTML formatter outputs self-contained HTML.
|
||||
htmlFull = Register("html", html.New(html.Standalone(true), html.WithClasses(true))) // nolint
|
||||
SVG = Register("svg", svg.New(svg.EmbedFont("Liberation Mono", svg.FontLiberationMono, svg.WOFF)))
|
||||
)
|
||||
|
||||
// Fallback formatter.
|
||||
var Fallback = NoOp
|
||||
|
||||
// Registry of Formatters.
|
||||
var Registry = map[string]chroma.Formatter{}
|
||||
|
||||
// Names of registered formatters.
|
||||
func Names() []string {
|
||||
out := []string{}
|
||||
for name := range Registry {
|
||||
out = append(out, name)
|
||||
}
|
||||
sort.Strings(out)
|
||||
return out
|
||||
}
|
||||
|
||||
// Get formatter by name.
|
||||
//
|
||||
// If the given formatter is not found, the Fallback formatter will be returned.
|
||||
func Get(name string) chroma.Formatter {
|
||||
if f, ok := Registry[name]; ok {
|
||||
return f
|
||||
}
|
||||
return Fallback
|
||||
}
|
||||
|
||||
// Register a named formatter.
|
||||
func Register(name string, formatter chroma.Formatter) chroma.Formatter {
|
||||
Registry[name] = formatter
|
||||
return formatter
|
||||
}
|
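
Since `Register` and `chroma.FormatterFunc` are both exported, a custom formatter can be plugged into this registry. The sketch below registers a formatter that just dumps token types and values; the "debug" name and the sample input are purely illustrative:

```go
package main

import (
    "fmt"
    "io"
    "os"

    "github.com/alecthomas/chroma"
    "github.com/alecthomas/chroma/formatters"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    // register a trivial formatter that prints one "type: value" pair per token
    formatters.Register("debug", chroma.FormatterFunc(
        func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
            for t := it(); t != chroma.EOF; t = it() {
                if _, err := fmt.Fprintf(w, "%s: %q\n", t.Type, t.Value); err != nil {
                    return err
                }
            }
            return nil
        }))

    // exercise it against a tiny Go snippet
    it, err := lexers.Get("go").Tokenise(nil, "package main\n")
    if err != nil {
        panic(err)
    }
    _ = formatters.Get("debug").Format(os.Stdout, styles.Fallback, it)
}
```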
455
vendor/github.com/alecthomas/chroma/formatters/html/html.go
generated
vendored
Normal file
@ -0,0 +1,455 @@
|
||||
package html
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"html"
|
||||
"io"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/alecthomas/chroma"
|
||||
)
|
||||
|
||||
// Option sets an option of the HTML formatter.
|
||||
type Option func(f *Formatter)
|
||||
|
||||
// Standalone configures the HTML formatter for generating a standalone HTML document.
|
||||
func Standalone(b bool) Option { return func(f *Formatter) { f.standalone = b } }
|
||||
|
||||
// ClassPrefix sets the CSS class prefix.
|
||||
func ClassPrefix(prefix string) Option { return func(f *Formatter) { f.prefix = prefix } }
|
||||
|
||||
// WithClasses emits HTML using CSS classes, rather than inline styles.
|
||||
func WithClasses(b bool) Option { return func(f *Formatter) { f.Classes = b } }
|
||||
|
||||
// WithAllClasses disables an optimisation that omits redundant CSS classes.
|
||||
func WithAllClasses(b bool) Option { return func(f *Formatter) { f.allClasses = b } }
|
||||
|
||||
// TabWidth sets the number of characters for a tab. Defaults to 8.
|
||||
func TabWidth(width int) Option { return func(f *Formatter) { f.tabWidth = width } }
|
||||
|
||||
// PreventSurroundingPre prevents the surrounding pre tags around the generated code.
|
||||
func PreventSurroundingPre(b bool) Option {
|
||||
return func(f *Formatter) {
|
||||
if b {
|
||||
f.preWrapper = nopPreWrapper
|
||||
} else {
|
||||
f.preWrapper = defaultPreWrapper
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// WithPreWrapper allows control of the surrounding pre tags.
|
||||
func WithPreWrapper(wrapper PreWrapper) Option {
|
||||
return func(f *Formatter) {
|
||||
f.preWrapper = wrapper
|
||||
}
|
||||
}
|
||||
|
||||
// WithLineNumbers formats output with line numbers.
|
||||
func WithLineNumbers(b bool) Option {
|
||||
return func(f *Formatter) {
|
||||
f.lineNumbers = b
|
||||
}
|
||||
}
|
||||
|
||||
// LineNumbersInTable will, when combined with WithLineNumbers, separate the line numbers
|
||||
// and code in table td's, which make them copy-and-paste friendly.
|
||||
func LineNumbersInTable(b bool) Option {
|
||||
return func(f *Formatter) {
|
||||
f.lineNumbersInTable = b
|
||||
}
|
||||
}
|
||||
|
||||
// LinkableLineNumbers decorates the line numbers HTML elements with an "id"
|
||||
// attribute so they can be linked.
|
||||
func LinkableLineNumbers(b bool, prefix string) Option {
|
||||
return func(f *Formatter) {
|
||||
f.linkableLineNumbers = b
|
||||
f.lineNumbersIDPrefix = prefix
|
||||
}
|
||||
}
|
||||
|
||||
// HighlightLines higlights the given line ranges with the Highlight style.
|
||||
//
|
||||
// A range is the beginning and ending of a range as 1-based line numbers, inclusive.
|
||||
func HighlightLines(ranges [][2]int) Option {
|
||||
return func(f *Formatter) {
|
||||
f.highlightRanges = ranges
|
||||
sort.Sort(f.highlightRanges)
|
||||
}
|
||||
}
|
||||
|
||||
// BaseLineNumber sets the initial number to start line numbering at. Defaults to 1.
|
||||
func BaseLineNumber(n int) Option {
|
||||
return func(f *Formatter) {
|
||||
f.baseLineNumber = n
|
||||
}
|
||||
}
|
||||
|
||||
// New HTML formatter.
|
||||
func New(options ...Option) *Formatter {
|
||||
f := &Formatter{
|
||||
baseLineNumber: 1,
|
||||
preWrapper: defaultPreWrapper,
|
||||
}
|
||||
for _, option := range options {
|
||||
option(f)
|
||||
}
|
||||
return f
|
||||
}
|
||||
|
||||
// PreWrapper defines the operations supported in WithPreWrapper.
|
||||
type PreWrapper interface {
|
||||
// Start is called to write a start <pre> element.
|
||||
// The code flag tells whether this block surrounds
|
||||
// highlighted code. This will be false when surrounding
|
||||
// line numbers.
|
||||
Start(code bool, styleAttr string) string
|
||||
|
||||
// End is called to write the end </pre> element.
|
||||
End(code bool) string
|
||||
}
|
||||
|
||||
type preWrapper struct {
|
||||
start func(code bool, styleAttr string) string
|
||||
end func(code bool) string
|
||||
}
|
||||
|
||||
func (p preWrapper) Start(code bool, styleAttr string) string {
|
||||
return p.start(code, styleAttr)
|
||||
}
|
||||
|
||||
func (p preWrapper) End(code bool) string {
|
||||
return p.end(code)
|
||||
}
|
||||
|
||||
var (
|
||||
nopPreWrapper = preWrapper{
|
||||
start: func(code bool, styleAttr string) string { return "" },
|
||||
end: func(code bool) string { return "" },
|
||||
}
|
||||
defaultPreWrapper = preWrapper{
|
||||
start: func(code bool, styleAttr string) string {
|
||||
return fmt.Sprintf("<pre%s>", styleAttr)
|
||||
},
|
||||
end: func(code bool) string {
|
||||
return "</pre>"
|
||||
},
|
||||
}
|
||||
)
|
||||
|
||||
// Formatter that generates HTML.
|
||||
type Formatter struct {
|
||||
standalone bool
|
||||
prefix string
|
||||
Classes bool // Exported field to detect when classes are being used
|
||||
allClasses bool
|
||||
preWrapper PreWrapper
|
||||
tabWidth int
|
||||
lineNumbers bool
|
||||
lineNumbersInTable bool
|
||||
linkableLineNumbers bool
|
||||
lineNumbersIDPrefix string
|
||||
highlightRanges highlightRanges
|
||||
baseLineNumber int
|
||||
}
|
||||
|
||||
type highlightRanges [][2]int
|
||||
|
||||
func (h highlightRanges) Len() int { return len(h) }
|
||||
func (h highlightRanges) Swap(i, j int) { h[i], h[j] = h[j], h[i] }
|
||||
func (h highlightRanges) Less(i, j int) bool { return h[i][0] < h[j][0] }
|
||||
|
||||
func (f *Formatter) Format(w io.Writer, style *chroma.Style, iterator chroma.Iterator) (err error) {
|
||||
return f.writeHTML(w, style, iterator.Tokens())
|
||||
}
|
||||
|
||||
// We deliberately don't use html/template here because it is two orders of magnitude slower (benchmarked).
|
||||
//
|
||||
// OTOH we need to be super careful about correct escaping...
|
||||
func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.Token) (err error) { // nolint: gocyclo
|
||||
css := f.styleToCSS(style)
|
||||
if !f.Classes {
|
||||
for t, style := range css {
|
||||
css[t] = compressStyle(style)
|
||||
}
|
||||
}
|
||||
if f.standalone {
|
||||
fmt.Fprint(w, "<html>\n")
|
||||
if f.Classes {
|
||||
fmt.Fprint(w, "<style type=\"text/css\">\n")
|
||||
err = f.WriteCSS(w, style)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
fmt.Fprintf(w, "body { %s; }\n", css[chroma.Background])
|
||||
fmt.Fprint(w, "</style>")
|
||||
}
|
||||
fmt.Fprintf(w, "<body%s>\n", f.styleAttr(css, chroma.Background))
|
||||
}
|
||||
|
||||
wrapInTable := f.lineNumbers && f.lineNumbersInTable
|
||||
|
||||
lines := chroma.SplitTokensIntoLines(tokens)
|
||||
lineDigits := len(fmt.Sprintf("%d", f.baseLineNumber+len(lines)-1))
|
||||
highlightIndex := 0
|
||||
|
||||
if wrapInTable {
|
||||
// List line numbers in its own <td>
|
||||
fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.Background))
|
||||
fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable))
|
||||
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
|
||||
fmt.Fprintf(w, f.preWrapper.Start(false, f.styleAttr(css, chroma.Background)))
|
||||
for index := range lines {
|
||||
line := f.baseLineNumber + index
|
||||
highlight, next := f.shouldHighlight(highlightIndex, line)
|
||||
if next {
|
||||
highlightIndex++
|
||||
}
|
||||
if highlight {
|
||||
fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
|
||||
}
|
||||
|
||||
fmt.Fprintf(w, "<span%s%s>%s\n</span>", f.styleAttr(css, chroma.LineNumbersTable), f.lineIDAttribute(line), f.lineTitleWithLinkIfNeeded(lineDigits, line))
|
||||
|
||||
if highlight {
|
||||
fmt.Fprintf(w, "</span>")
|
||||
}
|
||||
}
|
||||
fmt.Fprint(w, f.preWrapper.End(false))
|
||||
fmt.Fprint(w, "</td>\n")
|
||||
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD, "width:100%"))
|
||||
}
|
||||
|
||||
fmt.Fprintf(w, f.preWrapper.Start(true, f.styleAttr(css, chroma.Background)))
|
||||
|
||||
highlightIndex = 0
|
||||
for index, tokens := range lines {
|
||||
// 1-based line number.
|
||||
line := f.baseLineNumber + index
|
||||
highlight, next := f.shouldHighlight(highlightIndex, line)
|
||||
if next {
|
||||
highlightIndex++
|
||||
}
|
||||
if highlight {
|
||||
fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
|
||||
}
|
||||
|
||||
if f.lineNumbers && !wrapInTable {
|
||||
fmt.Fprintf(w, "<span%s%s>%s</span>", f.styleAttr(css, chroma.LineNumbers), f.lineIDAttribute(line), f.lineTitleWithLinkIfNeeded(lineDigits, line))
|
||||
}
|
||||
|
||||
for _, token := range tokens {
|
||||
html := html.EscapeString(token.String())
|
||||
attr := f.styleAttr(css, token.Type)
|
||||
if attr != "" {
|
||||
html = fmt.Sprintf("<span%s>%s</span>", attr, html)
|
||||
}
|
||||
fmt.Fprint(w, html)
|
||||
}
|
||||
if highlight {
|
||||
fmt.Fprintf(w, "</span>")
|
||||
}
|
||||
}
|
||||
|
||||
fmt.Fprintf(w, f.preWrapper.End(true))
|
||||
|
||||
if wrapInTable {
|
||||
fmt.Fprint(w, "</td></tr></table>\n")
|
||||
fmt.Fprint(w, "</div>\n")
|
||||
}
|
||||
|
||||
if f.standalone {
|
||||
fmt.Fprint(w, "\n</body>\n")
|
||||
fmt.Fprint(w, "</html>\n")
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (f *Formatter) lineIDAttribute(line int) string {
|
||||
if !f.linkableLineNumbers {
|
||||
return ""
|
||||
}
|
||||
return fmt.Sprintf(" id=\"%s\"", f.lineID(line))
|
||||
}
|
||||
|
||||
func (f *Formatter) lineTitleWithLinkIfNeeded(lineDigits, line int) string {
|
||||
title := fmt.Sprintf("%*d", lineDigits, line)
|
||||
if !f.linkableLineNumbers {
|
||||
return title
|
||||
}
|
||||
return fmt.Sprintf("<a style=\"outline: none; text-decoration:none; color:inherit\" href=\"#%s\">%s</a>", f.lineID(line), title)
|
||||
}
|
||||
|
||||
func (f *Formatter) lineID(line int) string {
|
||||
return fmt.Sprintf("%s%d", f.lineNumbersIDPrefix, line)
|
||||
}
|
||||
|
||||
func (f *Formatter) shouldHighlight(highlightIndex, line int) (bool, bool) {
|
||||
next := false
|
||||
for highlightIndex < len(f.highlightRanges) && line > f.highlightRanges[highlightIndex][1] {
|
||||
highlightIndex++
|
||||
next = true
|
||||
}
|
||||
if highlightIndex < len(f.highlightRanges) {
|
||||
hrange := f.highlightRanges[highlightIndex]
|
||||
if line >= hrange[0] && line <= hrange[1] {
|
||||
return true, next
|
||||
}
|
||||
}
|
||||
return false, next
|
||||
}
|
||||
|
||||
func (f *Formatter) class(t chroma.TokenType) string {
|
||||
for t != 0 {
|
||||
if cls, ok := chroma.StandardTypes[t]; ok {
|
||||
if cls != "" {
|
||||
return f.prefix + cls
|
||||
}
|
||||
return ""
|
||||
}
|
||||
t = t.Parent()
|
||||
}
|
||||
if cls := chroma.StandardTypes[t]; cls != "" {
|
||||
return f.prefix + cls
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
func (f *Formatter) styleAttr(styles map[chroma.TokenType]string, tt chroma.TokenType, extraCSS ...string) string {
|
||||
if f.Classes {
|
||||
cls := f.class(tt)
|
||||
if cls == "" {
|
||||
return ""
|
||||
}
|
||||
return fmt.Sprintf(` class="%s"`, cls)
|
||||
}
|
||||
if _, ok := styles[tt]; !ok {
|
||||
tt = tt.SubCategory()
|
||||
if _, ok := styles[tt]; !ok {
|
||||
tt = tt.Category()
|
||||
if _, ok := styles[tt]; !ok {
|
||||
return ""
|
||||
}
|
||||
}
|
||||
}
|
||||
css := []string{styles[tt]}
|
||||
css = append(css, extraCSS...)
|
||||
return fmt.Sprintf(` style="%s"`, strings.Join(css, ";"))
|
||||
}
|
||||
|
||||
func (f *Formatter) tabWidthStyle() string {
|
||||
if f.tabWidth != 0 && f.tabWidth != 8 {
|
||||
return fmt.Sprintf("; -moz-tab-size: %[1]d; -o-tab-size: %[1]d; tab-size: %[1]d", f.tabWidth)
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
// WriteCSS writes CSS style definitions (without any surrounding HTML).
|
||||
func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error {
|
||||
css := f.styleToCSS(style)
|
||||
// Special-case background as it is mapped to the outer ".chroma" class.
|
||||
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma { %s }\n", chroma.Background, f.prefix, css[chroma.Background]); err != nil {
|
||||
return err
|
||||
}
|
||||
// Special-case code column of table to expand width.
|
||||
if f.lineNumbers && f.lineNumbersInTable {
|
||||
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma .%s:last-child { width: 100%%; }",
|
||||
chroma.LineTableTD, f.prefix, f.class(chroma.LineTableTD)); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
// Special-case line number highlighting when targeted.
|
||||
if f.lineNumbers || f.lineNumbersInTable {
|
||||
targetedLineCSS := StyleEntryToCSS(style.Get(chroma.LineHighlight))
|
||||
for _, tt := range []chroma.TokenType{chroma.LineNumbers, chroma.LineNumbersTable} {
|
||||
fmt.Fprintf(w, "/* %s targeted by URL anchor */ .%schroma .%s:target { %s }\n", tt, f.prefix, f.class(tt), targetedLineCSS)
|
||||
}
|
||||
}
|
||||
tts := []int{}
|
||||
for tt := range css {
|
||||
tts = append(tts, int(tt))
|
||||
}
|
||||
sort.Ints(tts)
|
||||
for _, ti := range tts {
|
||||
tt := chroma.TokenType(ti)
|
||||
if tt == chroma.Background {
|
||||
continue
|
||||
}
|
||||
class := f.class(tt)
|
||||
if class == "" {
|
||||
continue
|
||||
}
|
||||
styles := css[tt]
|
||||
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma .%s { %s }\n", tt, f.prefix, class, styles); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (f *Formatter) styleToCSS(style *chroma.Style) map[chroma.TokenType]string {
|
||||
classes := map[chroma.TokenType]string{}
|
||||
bg := style.Get(chroma.Background)
|
||||
// Convert the style.
|
||||
for t := range chroma.StandardTypes {
|
||||
entry := style.Get(t)
|
||||
if t != chroma.Background {
|
||||
entry = entry.Sub(bg)
|
||||
}
|
||||
if !f.allClasses && entry.IsZero() {
|
||||
continue
|
||||
}
|
||||
classes[t] = StyleEntryToCSS(entry)
|
||||
}
|
||||
classes[chroma.Background] += f.tabWidthStyle()
|
||||
lineNumbersStyle := "margin-right: 0.4em; padding: 0 0.4em 0 0.4em;"
|
||||
// All rules begin with default rules followed by user provided rules
|
||||
classes[chroma.LineNumbers] = lineNumbersStyle + classes[chroma.LineNumbers]
|
||||
classes[chroma.LineNumbersTable] = lineNumbersStyle + classes[chroma.LineNumbersTable]
|
||||
classes[chroma.LineHighlight] = "display: block; width: 100%;" + classes[chroma.LineHighlight]
|
||||
classes[chroma.LineTable] = "border-spacing: 0; padding: 0; margin: 0; border: 0; width: auto; overflow: auto; display: block;" + classes[chroma.LineTable]
|
||||
classes[chroma.LineTableTD] = "vertical-align: top; padding: 0; margin: 0; border: 0;" + classes[chroma.LineTableTD]
|
||||
return classes
|
||||
}
|
||||
|
||||
// StyleEntryToCSS converts a chroma.StyleEntry to CSS attributes.
|
||||
func StyleEntryToCSS(e chroma.StyleEntry) string {
|
||||
styles := []string{}
|
||||
if e.Colour.IsSet() {
|
||||
styles = append(styles, "color: "+e.Colour.String())
|
||||
}
|
||||
if e.Background.IsSet() {
|
||||
styles = append(styles, "background-color: "+e.Background.String())
|
||||
}
|
||||
if e.Bold == chroma.Yes {
|
||||
styles = append(styles, "font-weight: bold")
|
||||
}
|
||||
if e.Italic == chroma.Yes {
|
||||
styles = append(styles, "font-style: italic")
|
||||
}
|
||||
if e.Underline == chroma.Yes {
|
||||
styles = append(styles, "text-decoration: underline")
|
||||
}
|
||||
return strings.Join(styles, "; ")
|
||||
}
|
||||
|
||||
// Compress CSS attributes - remove spaces, transform 6-digit colours to 3.
|
||||
func compressStyle(s string) string {
|
||||
parts := strings.Split(s, ";")
|
||||
out := []string{}
|
||||
for _, p := range parts {
|
||||
p = strings.Join(strings.Fields(p), " ")
|
||||
p = strings.Replace(p, ": ", ":", 1)
|
||||
if strings.Contains(p, "#") {
|
||||
c := p[len(p)-6:]
|
||||
if c[0] == c[1] && c[2] == c[3] && c[4] == c[5] {
|
||||
p = p[:len(p)-6] + c[0:1] + c[2:3] + c[4:5]
|
||||
}
|
||||
}
|
||||
out = append(out, p)
|
||||
}
|
||||
return strings.Join(out, ";")
|
||||
}
|
vendor/github.com/alecthomas/chroma/formatters/json.go (generated, vendored, new file, 31 lines)
@@ -0,0 +1,31 @@
package formatters

import (
	"encoding/json"
	"fmt"
	"io"

	"github.com/alecthomas/chroma"
)

// JSON formatter outputs the raw token structures as JSON.
var JSON = Register("json", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
	fmt.Fprintln(w, "[")
	i := 0
	for t := it(); t != chroma.EOF; t = it() {
		if i > 0 {
			fmt.Fprintln(w, ",")
		}
		i++
		bytes, err := json.Marshal(t)
		if err != nil {
			return err
		}
		if _, err := fmt.Fprint(w, " "+string(bytes)); err != nil {
			return err
		}
	}
	fmt.Fprintln(w)
	fmt.Fprintln(w, "]")
	return nil
}))
vendor/github.com/alecthomas/chroma/formatters/svg/font_liberation_mono.go (generated, vendored, new file, 51 lines)
File diff suppressed because one or more lines are too long
vendor/github.com/alecthomas/chroma/formatters/svg/svg.go (generated, vendored, new file, 222 lines)
@@ -0,0 +1,222 @@
|
||||
// Package svg contains an SVG formatter.
|
||||
package svg
|
||||
|
||||
import (
|
||||
"encoding/base64"
|
||||
"errors"
|
||||
"fmt"
|
||||
"io"
|
||||
"io/ioutil"
|
||||
"path"
|
||||
"strings"
|
||||
|
||||
"github.com/alecthomas/chroma"
|
||||
)
|
||||
|
||||
// Option sets an option of the SVG formatter.
|
||||
type Option func(f *Formatter)
|
||||
|
||||
// FontFamily sets the font-family.
|
||||
func FontFamily(fontFamily string) Option { return func(f *Formatter) { f.fontFamily = fontFamily } }
|
||||
|
||||
// EmbedFontFile embeds given font file
|
||||
func EmbedFontFile(fontFamily string, fileName string) (option Option, err error) {
|
||||
var format FontFormat
|
||||
switch path.Ext(fileName) {
|
||||
case ".woff":
|
||||
format = WOFF
|
||||
case ".woff2":
|
||||
format = WOFF2
|
||||
case ".ttf":
|
||||
format = TRUETYPE
|
||||
default:
|
||||
return nil, errors.New("unexpected font file suffix")
|
||||
}
|
||||
|
||||
var content []byte
|
||||
if content, err = ioutil.ReadFile(fileName); err == nil {
|
||||
option = EmbedFont(fontFamily, base64.StdEncoding.EncodeToString(content), format)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
// EmbedFont embeds given base64 encoded font
|
||||
func EmbedFont(fontFamily string, font string, format FontFormat) Option {
|
||||
return func(f *Formatter) { f.fontFamily = fontFamily; f.embeddedFont = font; f.fontFormat = format }
|
||||
}
|
||||
|
||||
// New SVG formatter.
|
||||
func New(options ...Option) *Formatter {
|
||||
f := &Formatter{fontFamily: "Consolas, Monaco, Lucida Console, Liberation Mono, DejaVu Sans Mono, Bitstream Vera Sans Mono, Courier New, monospace"}
|
||||
for _, option := range options {
|
||||
option(f)
|
||||
}
|
||||
return f
|
||||
}
|
||||
|
||||
// Formatter that generates SVG.
|
||||
type Formatter struct {
|
||||
fontFamily string
|
||||
embeddedFont string
|
||||
fontFormat FontFormat
|
||||
}
|
||||
|
||||
func (f *Formatter) Format(w io.Writer, style *chroma.Style, iterator chroma.Iterator) (err error) {
|
||||
f.writeSVG(w, style, iterator.Tokens())
|
||||
return err
|
||||
}
|
||||
|
||||
var svgEscaper = strings.NewReplacer(
	`&`, "&amp;",
	`<`, "&lt;",
	`>`, "&gt;",
	`"`, "&quot;",
	` `, "&#160;",
	"\t", "&#160;&#160;&#160;&#160;",
)
|
||||
|
||||
// EscapeString escapes special characters.
|
||||
func escapeString(s string) string {
|
||||
return svgEscaper.Replace(s)
|
||||
}
|
||||
|
||||
func (f *Formatter) writeSVG(w io.Writer, style *chroma.Style, tokens []chroma.Token) { // nolint: gocyclo
|
||||
svgStyles := f.styleToSVG(style)
|
||||
lines := chroma.SplitTokensIntoLines(tokens)
|
||||
|
||||
fmt.Fprint(w, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n")
|
||||
fmt.Fprint(w, "<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.0//EN\" \"http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd\">\n")
|
||||
fmt.Fprintf(w, "<svg width=\"%dpx\" height=\"%dpx\" xmlns=\"http://www.w3.org/2000/svg\">\n", 8*maxLineWidth(lines), 10+int(16.8*float64(len(lines)+1)))
|
||||
|
||||
if f.embeddedFont != "" {
|
||||
f.writeFontStyle(w)
|
||||
}
|
||||
|
||||
fmt.Fprintf(w, "<rect width=\"100%%\" height=\"100%%\" fill=\"%s\"/>\n", style.Get(chroma.Background).Background.String())
|
||||
fmt.Fprintf(w, "<g font-family=\"%s\" font-size=\"14px\" fill=\"%s\">\n", f.fontFamily, style.Get(chroma.Text).Colour.String())
|
||||
|
||||
f.writeTokenBackgrounds(w, lines, style)
|
||||
|
||||
for index, tokens := range lines {
|
||||
fmt.Fprintf(w, "<text x=\"0\" y=\"%fem\" xml:space=\"preserve\">", 1.2*float64(index+1))
|
||||
|
||||
for _, token := range tokens {
|
||||
text := escapeString(token.String())
|
||||
attr := f.styleAttr(svgStyles, token.Type)
|
||||
if attr != "" {
|
||||
text = fmt.Sprintf("<tspan %s>%s</tspan>", attr, text)
|
||||
}
|
||||
fmt.Fprint(w, text)
|
||||
}
|
||||
fmt.Fprint(w, "</text>")
|
||||
}
|
||||
|
||||
fmt.Fprint(w, "\n</g>\n")
|
||||
fmt.Fprint(w, "</svg>\n")
|
||||
}
|
||||
|
||||
func maxLineWidth(lines [][]chroma.Token) int {
|
||||
maxWidth := 0
|
||||
for _, tokens := range lines {
|
||||
length := 0
|
||||
for _, token := range tokens {
|
||||
length += len(strings.Replace(token.String(), "\t", "    ", -1))
|
||||
}
|
||||
if length > maxWidth {
|
||||
maxWidth = length
|
||||
}
|
||||
}
|
||||
return maxWidth
|
||||
}
|
||||
|
||||
// There is no background attribute for text in SVG so simply calculate the position and text
|
||||
// of tokens with a background color that differs from the default and add a rectangle for each before
|
||||
// adding the token.
|
||||
func (f *Formatter) writeTokenBackgrounds(w io.Writer, lines [][]chroma.Token, style *chroma.Style) {
|
||||
for index, tokens := range lines {
|
||||
lineLength := 0
|
||||
for _, token := range tokens {
|
||||
length := len(strings.Replace(token.String(), "\t", "    ", -1))
|
||||
tokenBackground := style.Get(token.Type).Background
|
||||
if tokenBackground.IsSet() && tokenBackground != style.Get(chroma.Background).Background {
|
||||
fmt.Fprintf(w, "<rect id=\"%s\" x=\"%dch\" y=\"%fem\" width=\"%dch\" height=\"1.2em\" fill=\"%s\" />\n", escapeString(token.String()), lineLength, 1.2*float64(index)+0.25, length, style.Get(token.Type).Background.String())
|
||||
}
|
||||
lineLength += length
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
type FontFormat int
|
||||
|
||||
// https://transfonter.org/formats
|
||||
const (
|
||||
WOFF FontFormat = iota
|
||||
WOFF2
|
||||
TRUETYPE
|
||||
)
|
||||
|
||||
var fontFormats = [...]string{
|
||||
"woff",
|
||||
"woff2",
|
||||
"truetype",
|
||||
}
|
||||
|
||||
func (f *Formatter) writeFontStyle(w io.Writer) {
|
||||
fmt.Fprintf(w, `<style>
|
||||
@font-face {
|
||||
font-family: '%s';
|
||||
src: url(data:application/x-font-%s;charset=utf-8;base64,%s) format('%s');'
|
||||
font-weight: normal;
|
||||
font-style: normal;
|
||||
}
|
||||
</style>`, f.fontFamily, fontFormats[f.fontFormat], f.embeddedFont, fontFormats[f.fontFormat])
|
||||
}
|
||||
|
||||
func (f *Formatter) styleAttr(styles map[chroma.TokenType]string, tt chroma.TokenType) string {
|
||||
if _, ok := styles[tt]; !ok {
|
||||
tt = tt.SubCategory()
|
||||
if _, ok := styles[tt]; !ok {
|
||||
tt = tt.Category()
|
||||
if _, ok := styles[tt]; !ok {
|
||||
return ""
|
||||
}
|
||||
}
|
||||
}
|
||||
return styles[tt]
|
||||
}
|
||||
|
||||
func (f *Formatter) styleToSVG(style *chroma.Style) map[chroma.TokenType]string {
|
||||
converted := map[chroma.TokenType]string{}
|
||||
bg := style.Get(chroma.Background)
|
||||
// Convert the style.
|
||||
for t := range chroma.StandardTypes {
|
||||
entry := style.Get(t)
|
||||
if t != chroma.Background {
|
||||
entry = entry.Sub(bg)
|
||||
}
|
||||
if entry.IsZero() {
|
||||
continue
|
||||
}
|
||||
converted[t] = StyleEntryToSVG(entry)
|
||||
}
|
||||
return converted
|
||||
}
|
||||
|
||||
// StyleEntryToSVG converts a chroma.StyleEntry to SVG attributes.
|
||||
func StyleEntryToSVG(e chroma.StyleEntry) string {
|
||||
var styles []string
|
||||
|
||||
if e.Colour.IsSet() {
|
||||
styles = append(styles, "fill=\""+e.Colour.String()+"\"")
|
||||
}
|
||||
if e.Bold == chroma.Yes {
|
||||
styles = append(styles, "font-weight=\"bold\"")
|
||||
}
|
||||
if e.Italic == chroma.Yes {
|
||||
styles = append(styles, "font-style=\"italic\"")
|
||||
}
|
||||
if e.Underline == chroma.Yes {
|
||||
styles = append(styles, "text-decoration=\"underline\"")
|
||||
}
|
||||
return strings.Join(styles, " ")
|
||||
}
|
vendor/github.com/alecthomas/chroma/formatters/tokens.go (generated, vendored, new file, 18 lines)
@@ -0,0 +1,18 @@
package formatters

import (
	"fmt"
	"io"

	"github.com/alecthomas/chroma"
)

// Tokens formatter outputs the raw token structures.
var Tokens = Register("tokens", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
	for t := it(); t != chroma.EOF; t = it() {
		if _, err := fmt.Fprintln(w, t.GoString()); err != nil {
			return err
		}
	}
	return nil
}))
vendor/github.com/alecthomas/chroma/formatters/tty_indexed.go (generated, vendored, new file, 282 lines)
@@ -0,0 +1,282 @@
|
||||
package formatters
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"io"
|
||||
"math"
|
||||
|
||||
"github.com/alecthomas/chroma"
|
||||
)
|
||||
|
||||
type ttyTable struct {
|
||||
foreground map[chroma.Colour]string
|
||||
background map[chroma.Colour]string
|
||||
}
|
||||
|
||||
var c = chroma.MustParseColour
|
||||
|
||||
var ttyTables = map[int]*ttyTable{
|
||||
8: {
|
||||
foreground: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m",
|
||||
c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m",
|
||||
c("#555555"): "\033[1m\033[30m", c("#ff0000"): "\033[1m\033[31m", c("#00ff00"): "\033[1m\033[32m", c("#ffff00"): "\033[1m\033[33m",
|
||||
c("#0000ff"): "\033[1m\033[34m", c("#ff00ff"): "\033[1m\033[35m", c("#00ffff"): "\033[1m\033[36m", c("#ffffff"): "\033[1m\033[37m",
|
||||
},
|
||||
background: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[40m", c("#7f0000"): "\033[41m", c("#007f00"): "\033[42m", c("#7f7fe0"): "\033[43m",
|
||||
c("#00007f"): "\033[44m", c("#7f007f"): "\033[45m", c("#007f7f"): "\033[46m", c("#e5e5e5"): "\033[47m",
|
||||
c("#555555"): "\033[1m\033[40m", c("#ff0000"): "\033[1m\033[41m", c("#00ff00"): "\033[1m\033[42m", c("#ffff00"): "\033[1m\033[43m",
|
||||
c("#0000ff"): "\033[1m\033[44m", c("#ff00ff"): "\033[1m\033[45m", c("#00ffff"): "\033[1m\033[46m", c("#ffffff"): "\033[1m\033[47m",
|
||||
},
|
||||
},
|
||||
16: {
|
||||
foreground: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m",
|
||||
c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m",
|
||||
c("#555555"): "\033[90m", c("#ff0000"): "\033[91m", c("#00ff00"): "\033[92m", c("#ffff00"): "\033[93m",
|
||||
c("#0000ff"): "\033[94m", c("#ff00ff"): "\033[95m", c("#00ffff"): "\033[96m", c("#ffffff"): "\033[97m",
|
||||
},
|
||||
background: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[40m", c("#7f0000"): "\033[41m", c("#007f00"): "\033[42m", c("#7f7fe0"): "\033[43m",
|
||||
c("#00007f"): "\033[44m", c("#7f007f"): "\033[45m", c("#007f7f"): "\033[46m", c("#e5e5e5"): "\033[47m",
|
||||
c("#555555"): "\033[100m", c("#ff0000"): "\033[101m", c("#00ff00"): "\033[102m", c("#ffff00"): "\033[103m",
|
||||
c("#0000ff"): "\033[104m", c("#ff00ff"): "\033[105m", c("#00ffff"): "\033[106m", c("#ffffff"): "\033[107m",
|
||||
},
|
||||
},
|
||||
256: {
|
||||
foreground: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[38;5;0m", c("#800000"): "\033[38;5;1m", c("#008000"): "\033[38;5;2m", c("#808000"): "\033[38;5;3m",
|
||||
c("#000080"): "\033[38;5;4m", c("#800080"): "\033[38;5;5m", c("#008080"): "\033[38;5;6m", c("#c0c0c0"): "\033[38;5;7m",
|
||||
c("#808080"): "\033[38;5;8m", c("#ff0000"): "\033[38;5;9m", c("#00ff00"): "\033[38;5;10m", c("#ffff00"): "\033[38;5;11m",
|
||||
c("#0000ff"): "\033[38;5;12m", c("#ff00ff"): "\033[38;5;13m", c("#00ffff"): "\033[38;5;14m", c("#ffffff"): "\033[38;5;15m",
|
||||
c("#000000"): "\033[38;5;16m", c("#00005f"): "\033[38;5;17m", c("#000087"): "\033[38;5;18m", c("#0000af"): "\033[38;5;19m",
|
||||
c("#0000d7"): "\033[38;5;20m", c("#0000ff"): "\033[38;5;21m", c("#005f00"): "\033[38;5;22m", c("#005f5f"): "\033[38;5;23m",
|
||||
c("#005f87"): "\033[38;5;24m", c("#005faf"): "\033[38;5;25m", c("#005fd7"): "\033[38;5;26m", c("#005fff"): "\033[38;5;27m",
|
||||
c("#008700"): "\033[38;5;28m", c("#00875f"): "\033[38;5;29m", c("#008787"): "\033[38;5;30m", c("#0087af"): "\033[38;5;31m",
|
||||
c("#0087d7"): "\033[38;5;32m", c("#0087ff"): "\033[38;5;33m", c("#00af00"): "\033[38;5;34m", c("#00af5f"): "\033[38;5;35m",
|
||||
c("#00af87"): "\033[38;5;36m", c("#00afaf"): "\033[38;5;37m", c("#00afd7"): "\033[38;5;38m", c("#00afff"): "\033[38;5;39m",
|
||||
c("#00d700"): "\033[38;5;40m", c("#00d75f"): "\033[38;5;41m", c("#00d787"): "\033[38;5;42m", c("#00d7af"): "\033[38;5;43m",
|
||||
c("#00d7d7"): "\033[38;5;44m", c("#00d7ff"): "\033[38;5;45m", c("#00ff00"): "\033[38;5;46m", c("#00ff5f"): "\033[38;5;47m",
|
||||
c("#00ff87"): "\033[38;5;48m", c("#00ffaf"): "\033[38;5;49m", c("#00ffd7"): "\033[38;5;50m", c("#00ffff"): "\033[38;5;51m",
|
||||
c("#5f0000"): "\033[38;5;52m", c("#5f005f"): "\033[38;5;53m", c("#5f0087"): "\033[38;5;54m", c("#5f00af"): "\033[38;5;55m",
|
||||
c("#5f00d7"): "\033[38;5;56m", c("#5f00ff"): "\033[38;5;57m", c("#5f5f00"): "\033[38;5;58m", c("#5f5f5f"): "\033[38;5;59m",
|
||||
c("#5f5f87"): "\033[38;5;60m", c("#5f5faf"): "\033[38;5;61m", c("#5f5fd7"): "\033[38;5;62m", c("#5f5fff"): "\033[38;5;63m",
|
||||
c("#5f8700"): "\033[38;5;64m", c("#5f875f"): "\033[38;5;65m", c("#5f8787"): "\033[38;5;66m", c("#5f87af"): "\033[38;5;67m",
|
||||
c("#5f87d7"): "\033[38;5;68m", c("#5f87ff"): "\033[38;5;69m", c("#5faf00"): "\033[38;5;70m", c("#5faf5f"): "\033[38;5;71m",
|
||||
c("#5faf87"): "\033[38;5;72m", c("#5fafaf"): "\033[38;5;73m", c("#5fafd7"): "\033[38;5;74m", c("#5fafff"): "\033[38;5;75m",
|
||||
c("#5fd700"): "\033[38;5;76m", c("#5fd75f"): "\033[38;5;77m", c("#5fd787"): "\033[38;5;78m", c("#5fd7af"): "\033[38;5;79m",
|
||||
c("#5fd7d7"): "\033[38;5;80m", c("#5fd7ff"): "\033[38;5;81m", c("#5fff00"): "\033[38;5;82m", c("#5fff5f"): "\033[38;5;83m",
|
||||
c("#5fff87"): "\033[38;5;84m", c("#5fffaf"): "\033[38;5;85m", c("#5fffd7"): "\033[38;5;86m", c("#5fffff"): "\033[38;5;87m",
|
||||
c("#870000"): "\033[38;5;88m", c("#87005f"): "\033[38;5;89m", c("#870087"): "\033[38;5;90m", c("#8700af"): "\033[38;5;91m",
|
||||
c("#8700d7"): "\033[38;5;92m", c("#8700ff"): "\033[38;5;93m", c("#875f00"): "\033[38;5;94m", c("#875f5f"): "\033[38;5;95m",
|
||||
c("#875f87"): "\033[38;5;96m", c("#875faf"): "\033[38;5;97m", c("#875fd7"): "\033[38;5;98m", c("#875fff"): "\033[38;5;99m",
|
||||
c("#878700"): "\033[38;5;100m", c("#87875f"): "\033[38;5;101m", c("#878787"): "\033[38;5;102m", c("#8787af"): "\033[38;5;103m",
|
||||
c("#8787d7"): "\033[38;5;104m", c("#8787ff"): "\033[38;5;105m", c("#87af00"): "\033[38;5;106m", c("#87af5f"): "\033[38;5;107m",
|
||||
c("#87af87"): "\033[38;5;108m", c("#87afaf"): "\033[38;5;109m", c("#87afd7"): "\033[38;5;110m", c("#87afff"): "\033[38;5;111m",
|
||||
c("#87d700"): "\033[38;5;112m", c("#87d75f"): "\033[38;5;113m", c("#87d787"): "\033[38;5;114m", c("#87d7af"): "\033[38;5;115m",
|
||||
c("#87d7d7"): "\033[38;5;116m", c("#87d7ff"): "\033[38;5;117m", c("#87ff00"): "\033[38;5;118m", c("#87ff5f"): "\033[38;5;119m",
|
||||
c("#87ff87"): "\033[38;5;120m", c("#87ffaf"): "\033[38;5;121m", c("#87ffd7"): "\033[38;5;122m", c("#87ffff"): "\033[38;5;123m",
|
||||
c("#af0000"): "\033[38;5;124m", c("#af005f"): "\033[38;5;125m", c("#af0087"): "\033[38;5;126m", c("#af00af"): "\033[38;5;127m",
|
||||
c("#af00d7"): "\033[38;5;128m", c("#af00ff"): "\033[38;5;129m", c("#af5f00"): "\033[38;5;130m", c("#af5f5f"): "\033[38;5;131m",
|
||||
c("#af5f87"): "\033[38;5;132m", c("#af5faf"): "\033[38;5;133m", c("#af5fd7"): "\033[38;5;134m", c("#af5fff"): "\033[38;5;135m",
|
||||
c("#af8700"): "\033[38;5;136m", c("#af875f"): "\033[38;5;137m", c("#af8787"): "\033[38;5;138m", c("#af87af"): "\033[38;5;139m",
|
||||
c("#af87d7"): "\033[38;5;140m", c("#af87ff"): "\033[38;5;141m", c("#afaf00"): "\033[38;5;142m", c("#afaf5f"): "\033[38;5;143m",
|
||||
c("#afaf87"): "\033[38;5;144m", c("#afafaf"): "\033[38;5;145m", c("#afafd7"): "\033[38;5;146m", c("#afafff"): "\033[38;5;147m",
|
||||
c("#afd700"): "\033[38;5;148m", c("#afd75f"): "\033[38;5;149m", c("#afd787"): "\033[38;5;150m", c("#afd7af"): "\033[38;5;151m",
|
||||
c("#afd7d7"): "\033[38;5;152m", c("#afd7ff"): "\033[38;5;153m", c("#afff00"): "\033[38;5;154m", c("#afff5f"): "\033[38;5;155m",
|
||||
c("#afff87"): "\033[38;5;156m", c("#afffaf"): "\033[38;5;157m", c("#afffd7"): "\033[38;5;158m", c("#afffff"): "\033[38;5;159m",
|
||||
c("#d70000"): "\033[38;5;160m", c("#d7005f"): "\033[38;5;161m", c("#d70087"): "\033[38;5;162m", c("#d700af"): "\033[38;5;163m",
|
||||
c("#d700d7"): "\033[38;5;164m", c("#d700ff"): "\033[38;5;165m", c("#d75f00"): "\033[38;5;166m", c("#d75f5f"): "\033[38;5;167m",
|
||||
c("#d75f87"): "\033[38;5;168m", c("#d75faf"): "\033[38;5;169m", c("#d75fd7"): "\033[38;5;170m", c("#d75fff"): "\033[38;5;171m",
|
||||
c("#d78700"): "\033[38;5;172m", c("#d7875f"): "\033[38;5;173m", c("#d78787"): "\033[38;5;174m", c("#d787af"): "\033[38;5;175m",
|
||||
c("#d787d7"): "\033[38;5;176m", c("#d787ff"): "\033[38;5;177m", c("#d7af00"): "\033[38;5;178m", c("#d7af5f"): "\033[38;5;179m",
|
||||
c("#d7af87"): "\033[38;5;180m", c("#d7afaf"): "\033[38;5;181m", c("#d7afd7"): "\033[38;5;182m", c("#d7afff"): "\033[38;5;183m",
|
||||
c("#d7d700"): "\033[38;5;184m", c("#d7d75f"): "\033[38;5;185m", c("#d7d787"): "\033[38;5;186m", c("#d7d7af"): "\033[38;5;187m",
|
||||
c("#d7d7d7"): "\033[38;5;188m", c("#d7d7ff"): "\033[38;5;189m", c("#d7ff00"): "\033[38;5;190m", c("#d7ff5f"): "\033[38;5;191m",
|
||||
c("#d7ff87"): "\033[38;5;192m", c("#d7ffaf"): "\033[38;5;193m", c("#d7ffd7"): "\033[38;5;194m", c("#d7ffff"): "\033[38;5;195m",
|
||||
c("#ff0000"): "\033[38;5;196m", c("#ff005f"): "\033[38;5;197m", c("#ff0087"): "\033[38;5;198m", c("#ff00af"): "\033[38;5;199m",
|
||||
c("#ff00d7"): "\033[38;5;200m", c("#ff00ff"): "\033[38;5;201m", c("#ff5f00"): "\033[38;5;202m", c("#ff5f5f"): "\033[38;5;203m",
|
||||
c("#ff5f87"): "\033[38;5;204m", c("#ff5faf"): "\033[38;5;205m", c("#ff5fd7"): "\033[38;5;206m", c("#ff5fff"): "\033[38;5;207m",
|
||||
c("#ff8700"): "\033[38;5;208m", c("#ff875f"): "\033[38;5;209m", c("#ff8787"): "\033[38;5;210m", c("#ff87af"): "\033[38;5;211m",
|
||||
c("#ff87d7"): "\033[38;5;212m", c("#ff87ff"): "\033[38;5;213m", c("#ffaf00"): "\033[38;5;214m", c("#ffaf5f"): "\033[38;5;215m",
|
||||
c("#ffaf87"): "\033[38;5;216m", c("#ffafaf"): "\033[38;5;217m", c("#ffafd7"): "\033[38;5;218m", c("#ffafff"): "\033[38;5;219m",
|
||||
c("#ffd700"): "\033[38;5;220m", c("#ffd75f"): "\033[38;5;221m", c("#ffd787"): "\033[38;5;222m", c("#ffd7af"): "\033[38;5;223m",
|
||||
c("#ffd7d7"): "\033[38;5;224m", c("#ffd7ff"): "\033[38;5;225m", c("#ffff00"): "\033[38;5;226m", c("#ffff5f"): "\033[38;5;227m",
|
||||
c("#ffff87"): "\033[38;5;228m", c("#ffffaf"): "\033[38;5;229m", c("#ffffd7"): "\033[38;5;230m", c("#ffffff"): "\033[38;5;231m",
|
||||
c("#080808"): "\033[38;5;232m", c("#121212"): "\033[38;5;233m", c("#1c1c1c"): "\033[38;5;234m", c("#262626"): "\033[38;5;235m",
|
||||
c("#303030"): "\033[38;5;236m", c("#3a3a3a"): "\033[38;5;237m", c("#444444"): "\033[38;5;238m", c("#4e4e4e"): "\033[38;5;239m",
|
||||
c("#585858"): "\033[38;5;240m", c("#626262"): "\033[38;5;241m", c("#6c6c6c"): "\033[38;5;242m", c("#767676"): "\033[38;5;243m",
|
||||
c("#808080"): "\033[38;5;244m", c("#8a8a8a"): "\033[38;5;245m", c("#949494"): "\033[38;5;246m", c("#9e9e9e"): "\033[38;5;247m",
|
||||
c("#a8a8a8"): "\033[38;5;248m", c("#b2b2b2"): "\033[38;5;249m", c("#bcbcbc"): "\033[38;5;250m", c("#c6c6c6"): "\033[38;5;251m",
|
||||
c("#d0d0d0"): "\033[38;5;252m", c("#dadada"): "\033[38;5;253m", c("#e4e4e4"): "\033[38;5;254m", c("#eeeeee"): "\033[38;5;255m",
|
||||
},
|
||||
background: map[chroma.Colour]string{
|
||||
c("#000000"): "\033[48;5;0m", c("#800000"): "\033[48;5;1m", c("#008000"): "\033[48;5;2m", c("#808000"): "\033[48;5;3m",
|
||||
c("#000080"): "\033[48;5;4m", c("#800080"): "\033[48;5;5m", c("#008080"): "\033[48;5;6m", c("#c0c0c0"): "\033[48;5;7m",
|
||||
c("#808080"): "\033[48;5;8m", c("#ff0000"): "\033[48;5;9m", c("#00ff00"): "\033[48;5;10m", c("#ffff00"): "\033[48;5;11m",
|
||||
c("#0000ff"): "\033[48;5;12m", c("#ff00ff"): "\033[48;5;13m", c("#00ffff"): "\033[48;5;14m", c("#ffffff"): "\033[48;5;15m",
|
||||
c("#000000"): "\033[48;5;16m", c("#00005f"): "\033[48;5;17m", c("#000087"): "\033[48;5;18m", c("#0000af"): "\033[48;5;19m",
|
||||
c("#0000d7"): "\033[48;5;20m", c("#0000ff"): "\033[48;5;21m", c("#005f00"): "\033[48;5;22m", c("#005f5f"): "\033[48;5;23m",
|
||||
c("#005f87"): "\033[48;5;24m", c("#005faf"): "\033[48;5;25m", c("#005fd7"): "\033[48;5;26m", c("#005fff"): "\033[48;5;27m",
|
||||
c("#008700"): "\033[48;5;28m", c("#00875f"): "\033[48;5;29m", c("#008787"): "\033[48;5;30m", c("#0087af"): "\033[48;5;31m",
|
||||
c("#0087d7"): "\033[48;5;32m", c("#0087ff"): "\033[48;5;33m", c("#00af00"): "\033[48;5;34m", c("#00af5f"): "\033[48;5;35m",
|
||||
c("#00af87"): "\033[48;5;36m", c("#00afaf"): "\033[48;5;37m", c("#00afd7"): "\033[48;5;38m", c("#00afff"): "\033[48;5;39m",
|
||||
c("#00d700"): "\033[48;5;40m", c("#00d75f"): "\033[48;5;41m", c("#00d787"): "\033[48;5;42m", c("#00d7af"): "\033[48;5;43m",
|
||||
c("#00d7d7"): "\033[48;5;44m", c("#00d7ff"): "\033[48;5;45m", c("#00ff00"): "\033[48;5;46m", c("#00ff5f"): "\033[48;5;47m",
|
||||
c("#00ff87"): "\033[48;5;48m", c("#00ffaf"): "\033[48;5;49m", c("#00ffd7"): "\033[48;5;50m", c("#00ffff"): "\033[48;5;51m",
|
||||
c("#5f0000"): "\033[48;5;52m", c("#5f005f"): "\033[48;5;53m", c("#5f0087"): "\033[48;5;54m", c("#5f00af"): "\033[48;5;55m",
|
||||
c("#5f00d7"): "\033[48;5;56m", c("#5f00ff"): "\033[48;5;57m", c("#5f5f00"): "\033[48;5;58m", c("#5f5f5f"): "\033[48;5;59m",
|
||||
c("#5f5f87"): "\033[48;5;60m", c("#5f5faf"): "\033[48;5;61m", c("#5f5fd7"): "\033[48;5;62m", c("#5f5fff"): "\033[48;5;63m",
|
||||
c("#5f8700"): "\033[48;5;64m", c("#5f875f"): "\033[48;5;65m", c("#5f8787"): "\033[48;5;66m", c("#5f87af"): "\033[48;5;67m",
|
||||
c("#5f87d7"): "\033[48;5;68m", c("#5f87ff"): "\033[48;5;69m", c("#5faf00"): "\033[48;5;70m", c("#5faf5f"): "\033[48;5;71m",
|
||||
c("#5faf87"): "\033[48;5;72m", c("#5fafaf"): "\033[48;5;73m", c("#5fafd7"): "\033[48;5;74m", c("#5fafff"): "\033[48;5;75m",
|
||||
c("#5fd700"): "\033[48;5;76m", c("#5fd75f"): "\033[48;5;77m", c("#5fd787"): "\033[48;5;78m", c("#5fd7af"): "\033[48;5;79m",
|
||||
c("#5fd7d7"): "\033[48;5;80m", c("#5fd7ff"): "\033[48;5;81m", c("#5fff00"): "\033[48;5;82m", c("#5fff5f"): "\033[48;5;83m",
|
||||
c("#5fff87"): "\033[48;5;84m", c("#5fffaf"): "\033[48;5;85m", c("#5fffd7"): "\033[48;5;86m", c("#5fffff"): "\033[48;5;87m",
|
||||
c("#870000"): "\033[48;5;88m", c("#87005f"): "\033[48;5;89m", c("#870087"): "\033[48;5;90m", c("#8700af"): "\033[48;5;91m",
|
||||
c("#8700d7"): "\033[48;5;92m", c("#8700ff"): "\033[48;5;93m", c("#875f00"): "\033[48;5;94m", c("#875f5f"): "\033[48;5;95m",
|
||||
c("#875f87"): "\033[48;5;96m", c("#875faf"): "\033[48;5;97m", c("#875fd7"): "\033[48;5;98m", c("#875fff"): "\033[48;5;99m",
|
||||
c("#878700"): "\033[48;5;100m", c("#87875f"): "\033[48;5;101m", c("#878787"): "\033[48;5;102m", c("#8787af"): "\033[48;5;103m",
|
||||
c("#8787d7"): "\033[48;5;104m", c("#8787ff"): "\033[48;5;105m", c("#87af00"): "\033[48;5;106m", c("#87af5f"): "\033[48;5;107m",
|
||||
c("#87af87"): "\033[48;5;108m", c("#87afaf"): "\033[48;5;109m", c("#87afd7"): "\033[48;5;110m", c("#87afff"): "\033[48;5;111m",
|
||||
c("#87d700"): "\033[48;5;112m", c("#87d75f"): "\033[48;5;113m", c("#87d787"): "\033[48;5;114m", c("#87d7af"): "\033[48;5;115m",
|
||||
c("#87d7d7"): "\033[48;5;116m", c("#87d7ff"): "\033[48;5;117m", c("#87ff00"): "\033[48;5;118m", c("#87ff5f"): "\033[48;5;119m",
|
||||
c("#87ff87"): "\033[48;5;120m", c("#87ffaf"): "\033[48;5;121m", c("#87ffd7"): "\033[48;5;122m", c("#87ffff"): "\033[48;5;123m",
|
||||
c("#af0000"): "\033[48;5;124m", c("#af005f"): "\033[48;5;125m", c("#af0087"): "\033[48;5;126m", c("#af00af"): "\033[48;5;127m",
|
||||
c("#af00d7"): "\033[48;5;128m", c("#af00ff"): "\033[48;5;129m", c("#af5f00"): "\033[48;5;130m", c("#af5f5f"): "\033[48;5;131m",
|
||||
c("#af5f87"): "\033[48;5;132m", c("#af5faf"): "\033[48;5;133m", c("#af5fd7"): "\033[48;5;134m", c("#af5fff"): "\033[48;5;135m",
|
||||
c("#af8700"): "\033[48;5;136m", c("#af875f"): "\033[48;5;137m", c("#af8787"): "\033[48;5;138m", c("#af87af"): "\033[48;5;139m",
|
||||
c("#af87d7"): "\033[48;5;140m", c("#af87ff"): "\033[48;5;141m", c("#afaf00"): "\033[48;5;142m", c("#afaf5f"): "\033[48;5;143m",
|
||||
c("#afaf87"): "\033[48;5;144m", c("#afafaf"): "\033[48;5;145m", c("#afafd7"): "\033[48;5;146m", c("#afafff"): "\033[48;5;147m",
|
||||
c("#afd700"): "\033[48;5;148m", c("#afd75f"): "\033[48;5;149m", c("#afd787"): "\033[48;5;150m", c("#afd7af"): "\033[48;5;151m",
|
||||
c("#afd7d7"): "\033[48;5;152m", c("#afd7ff"): "\033[48;5;153m", c("#afff00"): "\033[48;5;154m", c("#afff5f"): "\033[48;5;155m",
|
||||
c("#afff87"): "\033[48;5;156m", c("#afffaf"): "\033[48;5;157m", c("#afffd7"): "\033[48;5;158m", c("#afffff"): "\033[48;5;159m",
|
||||
c("#d70000"): "\033[48;5;160m", c("#d7005f"): "\033[48;5;161m", c("#d70087"): "\033[48;5;162m", c("#d700af"): "\033[48;5;163m",
|
||||
c("#d700d7"): "\033[48;5;164m", c("#d700ff"): "\033[48;5;165m", c("#d75f00"): "\033[48;5;166m", c("#d75f5f"): "\033[48;5;167m",
|
||||
c("#d75f87"): "\033[48;5;168m", c("#d75faf"): "\033[48;5;169m", c("#d75fd7"): "\033[48;5;170m", c("#d75fff"): "\033[48;5;171m",
|
||||
c("#d78700"): "\033[48;5;172m", c("#d7875f"): "\033[48;5;173m", c("#d78787"): "\033[48;5;174m", c("#d787af"): "\033[48;5;175m",
|
||||
c("#d787d7"): "\033[48;5;176m", c("#d787ff"): "\033[48;5;177m", c("#d7af00"): "\033[48;5;178m", c("#d7af5f"): "\033[48;5;179m",
|
||||
c("#d7af87"): "\033[48;5;180m", c("#d7afaf"): "\033[48;5;181m", c("#d7afd7"): "\033[48;5;182m", c("#d7afff"): "\033[48;5;183m",
|
||||
c("#d7d700"): "\033[48;5;184m", c("#d7d75f"): "\033[48;5;185m", c("#d7d787"): "\033[48;5;186m", c("#d7d7af"): "\033[48;5;187m",
|
||||
c("#d7d7d7"): "\033[48;5;188m", c("#d7d7ff"): "\033[48;5;189m", c("#d7ff00"): "\033[48;5;190m", c("#d7ff5f"): "\033[48;5;191m",
|
||||
c("#d7ff87"): "\033[48;5;192m", c("#d7ffaf"): "\033[48;5;193m", c("#d7ffd7"): "\033[48;5;194m", c("#d7ffff"): "\033[48;5;195m",
|
||||
c("#ff0000"): "\033[48;5;196m", c("#ff005f"): "\033[48;5;197m", c("#ff0087"): "\033[48;5;198m", c("#ff00af"): "\033[48;5;199m",
|
||||
c("#ff00d7"): "\033[48;5;200m", c("#ff00ff"): "\033[48;5;201m", c("#ff5f00"): "\033[48;5;202m", c("#ff5f5f"): "\033[48;5;203m",
|
||||
c("#ff5f87"): "\033[48;5;204m", c("#ff5faf"): "\033[48;5;205m", c("#ff5fd7"): "\033[48;5;206m", c("#ff5fff"): "\033[48;5;207m",
|
||||
c("#ff8700"): "\033[48;5;208m", c("#ff875f"): "\033[48;5;209m", c("#ff8787"): "\033[48;5;210m", c("#ff87af"): "\033[48;5;211m",
|
||||
c("#ff87d7"): "\033[48;5;212m", c("#ff87ff"): "\033[48;5;213m", c("#ffaf00"): "\033[48;5;214m", c("#ffaf5f"): "\033[48;5;215m",
|
||||
c("#ffaf87"): "\033[48;5;216m", c("#ffafaf"): "\033[48;5;217m", c("#ffafd7"): "\033[48;5;218m", c("#ffafff"): "\033[48;5;219m",
|
||||
c("#ffd700"): "\033[48;5;220m", c("#ffd75f"): "\033[48;5;221m", c("#ffd787"): "\033[48;5;222m", c("#ffd7af"): "\033[48;5;223m",
|
||||
c("#ffd7d7"): "\033[48;5;224m", c("#ffd7ff"): "\033[48;5;225m", c("#ffff00"): "\033[48;5;226m", c("#ffff5f"): "\033[48;5;227m",
|
||||
c("#ffff87"): "\033[48;5;228m", c("#ffffaf"): "\033[48;5;229m", c("#ffffd7"): "\033[48;5;230m", c("#ffffff"): "\033[48;5;231m",
|
||||
c("#080808"): "\033[48;5;232m", c("#121212"): "\033[48;5;233m", c("#1c1c1c"): "\033[48;5;234m", c("#262626"): "\033[48;5;235m",
|
||||
c("#303030"): "\033[48;5;236m", c("#3a3a3a"): "\033[48;5;237m", c("#444444"): "\033[48;5;238m", c("#4e4e4e"): "\033[48;5;239m",
|
||||
c("#585858"): "\033[48;5;240m", c("#626262"): "\033[48;5;241m", c("#6c6c6c"): "\033[48;5;242m", c("#767676"): "\033[48;5;243m",
|
||||
c("#808080"): "\033[48;5;244m", c("#8a8a8a"): "\033[48;5;245m", c("#949494"): "\033[48;5;246m", c("#9e9e9e"): "\033[48;5;247m",
|
||||
c("#a8a8a8"): "\033[48;5;248m", c("#b2b2b2"): "\033[48;5;249m", c("#bcbcbc"): "\033[48;5;250m", c("#c6c6c6"): "\033[48;5;251m",
|
||||
c("#d0d0d0"): "\033[48;5;252m", c("#dadada"): "\033[48;5;253m", c("#e4e4e4"): "\033[48;5;254m", c("#eeeeee"): "\033[48;5;255m",
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
func entryToEscapeSequence(table *ttyTable, entry chroma.StyleEntry) string {
|
||||
out := ""
|
||||
if entry.Bold == chroma.Yes {
|
||||
out += "\033[1m"
|
||||
}
|
||||
if entry.Underline == chroma.Yes {
|
||||
out += "\033[4m"
|
||||
}
|
||||
if entry.Italic == chroma.Yes {
|
||||
out += "\033[3m"
|
||||
}
|
||||
if entry.Colour.IsSet() {
|
||||
out += table.foreground[findClosest(table, entry.Colour)]
|
||||
}
|
||||
if entry.Background.IsSet() {
|
||||
out += table.background[findClosest(table, entry.Background)]
|
||||
}
|
||||
return out
|
||||
}
|
||||
|
||||
func findClosest(table *ttyTable, seeking chroma.Colour) chroma.Colour {
|
||||
closestColour := chroma.Colour(0)
|
||||
closest := float64(math.MaxFloat64)
|
||||
for colour := range table.foreground {
|
||||
distance := colour.Distance(seeking)
|
||||
if distance < closest {
|
||||
closest = distance
|
||||
closestColour = colour
|
||||
}
|
||||
}
|
||||
return closestColour
|
||||
}
|
||||
|
||||
func styleToEscapeSequence(table *ttyTable, style *chroma.Style) map[chroma.TokenType]string {
|
||||
style = clearBackground(style)
|
||||
out := map[chroma.TokenType]string{}
|
||||
for _, ttype := range style.Types() {
|
||||
entry := style.Get(ttype)
|
||||
out[ttype] = entryToEscapeSequence(table, entry)
|
||||
}
|
||||
return out
|
||||
}
|
||||
|
||||
// Clear the background colour.
|
||||
func clearBackground(style *chroma.Style) *chroma.Style {
|
||||
builder := style.Builder()
|
||||
bg := builder.Get(chroma.Background)
|
||||
bg.Background = 0
|
||||
bg.NoInherit = true
|
||||
builder.AddEntry(chroma.Background, bg)
|
||||
style, _ = builder.Build()
|
||||
return style
|
||||
}
|
||||
|
||||
type indexedTTYFormatter struct {
|
||||
table *ttyTable
|
||||
}
|
||||
|
||||
func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma.Iterator) (err error) {
|
||||
theme := styleToEscapeSequence(c.table, style)
|
||||
for token := it(); token != chroma.EOF; token = it() {
|
||||
clr, ok := theme[token.Type]
|
||||
if !ok {
|
||||
clr, ok = theme[token.Type.SubCategory()]
|
||||
if !ok {
|
||||
clr = theme[token.Type.Category()]
|
||||
}
|
||||
}
|
||||
if clr != "" {
|
||||
fmt.Fprint(w, clr)
|
||||
}
|
||||
fmt.Fprint(w, token.Value)
|
||||
if clr != "" {
|
||||
fmt.Fprintf(w, "\033[0m")
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// TTY is an 8-colour terminal formatter.
|
||||
//
|
||||
// The Lab colour space is used to map RGB values to the most appropriate index colour.
|
||||
var TTY = Register("terminal", &indexedTTYFormatter{ttyTables[8]})
|
||||
|
||||
// TTY8 is an 8-colour terminal formatter.
|
||||
//
|
||||
// The Lab colour space is used to map RGB values to the most appropriate index colour.
|
||||
var TTY8 = Register("terminal8", &indexedTTYFormatter{ttyTables[8]})
|
||||
|
||||
// TTY16 is a 16-colour terminal formatter.
|
||||
//
|
||||
// It uses \033[3xm for normal colours and \033[90Xm for bright colours.
|
||||
//
|
||||
// The Lab colour space is used to map RGB values to the most appropriate index colour.
|
||||
var TTY16 = Register("terminal16", &indexedTTYFormatter{ttyTables[16]})
|
||||
|
||||
// TTY256 is a 256-colour terminal formatter.
|
||||
//
|
||||
// The Lab colour space is used to map RGB values to the most appropriate index colour.
|
||||
var TTY256 = Register("terminal256", &indexedTTYFormatter{ttyTables[256]})
|
vendor/github.com/alecthomas/chroma/formatters/tty_truecolour.go (generated, vendored, new file, 42 lines)
@@ -0,0 +1,42 @@
package formatters

import (
	"fmt"
	"io"

	"github.com/alecthomas/chroma"
)

// TTY16m is a true-colour terminal formatter.
var TTY16m = Register("terminal16m", chroma.FormatterFunc(trueColourFormatter))

func trueColourFormatter(w io.Writer, style *chroma.Style, it chroma.Iterator) error {
	style = clearBackground(style)
	for token := it(); token != chroma.EOF; token = it() {
		entry := style.Get(token.Type)
		if !entry.IsZero() {
			out := ""
			if entry.Bold == chroma.Yes {
				out += "\033[1m"
			}
			if entry.Underline == chroma.Yes {
				out += "\033[4m"
			}
			if entry.Italic == chroma.Yes {
				out += "\033[3m"
			}
			if entry.Colour.IsSet() {
				out += fmt.Sprintf("\033[38;2;%d;%d;%dm", entry.Colour.Red(), entry.Colour.Green(), entry.Colour.Blue())
			}
			if entry.Background.IsSet() {
				out += fmt.Sprintf("\033[48;2;%d;%d;%dm", entry.Background.Red(), entry.Background.Green(), entry.Background.Blue())
			}
			fmt.Fprint(w, out)
		}
		fmt.Fprint(w, token.Value)
		if !entry.IsZero() {
			fmt.Fprint(w, "\033[0m")
		}
	}
	return nil
}
vendor/github.com/alecthomas/chroma/go.mod (generated, vendored, new file, 18 lines)
@@ -0,0 +1,18 @@
module github.com/alecthomas/chroma

go 1.13

require (
	github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38
	github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 // indirect
	github.com/alecthomas/kong v0.2.4
	github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 // indirect
	github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964
	github.com/dlclark/regexp2 v1.4.0
	github.com/mattn/go-colorable v0.1.6
	github.com/mattn/go-isatty v0.0.12
	github.com/pkg/errors v0.9.1 // indirect
	github.com/sergi/go-diff v1.0.0 // indirect
	github.com/stretchr/testify v1.3.0 // indirect
	golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 // indirect
)
vendor/github.com/alecthomas/chroma/go.sum (generated, vendored, new file, 38 lines)
@@ -0,0 +1,38 @@
|
||||
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
|
||||
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
|
||||
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
|
||||
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
|
||||
github.com/alecthomas/kong v0.2.4 h1:Y0ZBCHAvHhTHw7FFJ2FzCAAG4pkbTgA45nc7BpMhDNk=
|
||||
github.com/alecthomas/kong v0.2.4/go.mod h1:kQOmtJgV+Lb4aj+I2LEn40cbtawdWJ9Y8QLq+lElKxE=
|
||||
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
|
||||
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
|
||||
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
|
||||
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
|
||||
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/dlclark/regexp2 v1.2.0 h1:8sAhBGEM0dRWogWqWyQeIJnxjWO6oIjl8FKqREDsGfk=
|
||||
github.com/dlclark/regexp2 v1.2.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
|
||||
github.com/dlclark/regexp2 v1.4.0 h1:F1rxgk7p4uKjwIQxBs9oAXe5CqrXlCduYEJvrF4u93E=
|
||||
github.com/dlclark/regexp2 v1.4.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
|
||||
github.com/mattn/go-colorable v0.1.6 h1:6Su7aK7lXmJ/U79bYtBjLNaha4Fs1Rg9plHpcH+vvnE=
|
||||
github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
|
||||
github.com/mattn/go-isatty v0.0.12 h1:wuysRhFDzyxgEmMf5xjvJ2M9dZoWAXNNr5LSBS7uHXY=
|
||||
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
|
||||
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
|
||||
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
|
||||
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
|
||||
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
|
||||
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
||||
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
|
||||
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
|
||||
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
|
||||
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 h1:opSr2sbRXk5X5/givKrrKj9HXxFpW2sdCiP8MJSKLQY=
|
||||
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
vendor/github.com/alecthomas/chroma/iterator.go (generated, vendored, new file, 76 lines)
@@ -0,0 +1,76 @@
package chroma

import "strings"

// An Iterator across tokens.
//
// EOF will be returned at the end of the Token stream.
//
// If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
type Iterator func() Token

// Tokens consumes all tokens from the iterator and returns them as a slice.
func (i Iterator) Tokens() []Token {
	var out []Token
	for t := i(); t != EOF; t = i() {
		out = append(out, t)
	}
	return out
}

// Concaterator concatenates tokens from a series of iterators.
func Concaterator(iterators ...Iterator) Iterator {
	return func() Token {
		for len(iterators) > 0 {
			t := iterators[0]()
			if t != EOF {
				return t
			}
			iterators = iterators[1:]
		}
		return EOF
	}
}

// Literator converts a sequence of literal Tokens into an Iterator.
func Literator(tokens ...Token) Iterator {
	return func() Token {
		if len(tokens) == 0 {
			return EOF
		}
		token := tokens[0]
		tokens = tokens[1:]
		return token
	}
}

// SplitTokensIntoLines splits tokens containing newlines in two.
func SplitTokensIntoLines(tokens []Token) (out [][]Token) {
	var line []Token // nolint: prealloc
	for _, token := range tokens {
		for strings.Contains(token.Value, "\n") {
			parts := strings.SplitAfterN(token.Value, "\n", 2)
			// Token becomes the tail.
			token.Value = parts[1]

			// Append the head to the line and flush the line.
			clone := token.Clone()
			clone.Value = parts[0]
			line = append(line, clone)
			out = append(out, line)
			line = nil
		}
		line = append(line, token)
	}
	if len(line) > 0 {
		out = append(out, line)
	}
	// Strip empty trailing token line.
	if len(out) > 0 {
		last := out[len(out)-1]
		if len(last) == 1 && last[0].Value == "" {
			out = out[:len(out)-1]
		}
	}
	return
}
vendor/github.com/alecthomas/chroma/lexer.go (generated, vendored, new file, 125 lines)
@@ -0,0 +1,125 @@
package chroma

import (
	"fmt"
)

var (
	defaultOptions = &TokeniseOptions{
		State:    "root",
		EnsureLF: true,
	}
)

// Config for a lexer.
type Config struct {
	// Name of the lexer.
	Name string

	// Shortcuts for the lexer
	Aliases []string

	// File name globs
	Filenames []string

	// Secondary file name globs
	AliasFilenames []string

	// MIME types
	MimeTypes []string

	// Regex matching is case-insensitive.
	CaseInsensitive bool

	// Regex matches all characters.
	DotAll bool

	// Regex does not match across lines ($ matches EOL).
	//
	// Defaults to multiline.
	NotMultiline bool

	// Don't strip leading and trailing newlines from the input.
	// DontStripNL bool

	// Strip all leading and trailing whitespace from the input
	// StripAll bool

	// Make sure that the input ends with a newline. This
	// is required for some lexers that consume input linewise.
	EnsureNL bool

	// If given and greater than 0, expand tabs in the input.
	// TabSize int

	// Priority of lexer.
	//
	// If this is 0 it will be treated as a default of 1.
	Priority float32
}

// Token output to formatter.
type Token struct {
	Type  TokenType `json:"type"`
	Value string    `json:"value"`
}

func (t *Token) String() string   { return t.Value }
func (t *Token) GoString() string { return fmt.Sprintf("&Token{%s, %q}", t.Type, t.Value) }

// Clone returns a clone of the Token.
func (t *Token) Clone() Token {
	return *t
}

// EOF is returned by lexers at the end of input.
var EOF Token

// TokeniseOptions contains options for tokenisers.
type TokeniseOptions struct {
	// State to start tokenisation in. Defaults to "root".
	State string
	// Nested tokenisation.
	Nested bool

	// If true, all EOLs are converted into LF
	// by replacing CRLF and CR
	EnsureLF bool
}

// A Lexer for tokenising source code.
type Lexer interface {
	// Config describing the features of the Lexer.
	Config() *Config
	// Tokenise returns an Iterator over tokens in text.
	Tokenise(options *TokeniseOptions, text string) (Iterator, error)
}

// Lexers is a slice of lexers sortable by name.
type Lexers []Lexer

func (l Lexers) Len() int           { return len(l) }
func (l Lexers) Swap(i, j int)      { l[i], l[j] = l[j], l[i] }
func (l Lexers) Less(i, j int) bool { return l[i].Config().Name < l[j].Config().Name }

// PrioritisedLexers is a slice of lexers sortable by priority.
type PrioritisedLexers []Lexer

func (l PrioritisedLexers) Len() int      { return len(l) }
func (l PrioritisedLexers) Swap(i, j int) { l[i], l[j] = l[j], l[i] }
func (l PrioritisedLexers) Less(i, j int) bool {
	ip := l[i].Config().Priority
	if ip == 0 {
		ip = 1
	}
	jp := l[j].Config().Priority
	if jp == 0 {
		jp = 1
	}
	return ip > jp
}

// Analyser determines how appropriate this lexer is for the given text.
type Analyser interface {
	AnalyseText(text string) float32
}
vendor/github.com/alecthomas/chroma/lexers/README.md (generated, vendored, new file, 37 lines)
@@ -0,0 +1,37 @@
# Lexer tests

The tests in this directory feed a known input `testdata/<name>.actual` into the parser for `<name>` and check that its output matches `<name>.expected`.

## Running the tests

Run the tests as normal:
```go
go test ./lexers
```

## Update existing tests

When you add a new test data file (`*.actual`), you need to regenerate all tests. That's how Chroma creates the `*.expected` test file based on the corresponding lexer.

To regenerate all tests, type in your terminal:

```go
RECORD=true go test ./lexers
```

This first sets the `RECORD` environment variable to `true`. Then it runs `go test` on the `./lexers` directory of the Chroma project.

(That environment variable tells Chroma it needs to output test data. After running `go test ./lexers` you can remove or reset that variable.)

### Windows users

Windows users will find that the `RECORD=true go test ./lexers` command fails in both the standard command prompt terminal and in PowerShell.

Instead we have to perform both steps separately:

- Set the `RECORD` environment variable to `true`.
  + In the regular command prompt window, the `set` command sets an environment variable for the current session: `set RECORD=true`. See [this page](https://superuser.com/questions/212150/how-to-set-env-variable-in-windows-cmd-line) for more.
  + In PowerShell, you can use the `$env:RECORD = 'true'` command for that. See [this article](https://mcpmag.com/articles/2019/03/28/environment-variables-in-powershell.aspx) for more.
  + You can also make a persistent environment variable by hand in the Windows computer settings. See [this article](https://www.computerhope.com/issues/ch000549.htm) for how.
- When the environment variable is set, run `go test ./lexers`.

Chroma will now regenerate the test files and print its results to the console window.
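For Windows users following the steps described in the vendored README above, a minimal PowerShell sketch of the two-step recording workflow might look like the following. This is illustrative only, not part of the vendored file, and it assumes a local checkout of the chroma repository with Go installed:

```
# Illustrative sketch: record fresh *.expected files for the lexer tests on Windows.
$env:RECORD = 'true'      # tell the test suite to write expected outputs
go test ./lexers          # regenerate and run the lexer tests
Remove-Item Env:RECORD    # unset the variable once recording is done
```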