mirror of https://github.com/cheat/cheat.git
synced 2026-03-08 03:33:33 +01:00

Compare commits — 10 commits
Commits in this compare:

- cc85a4bdb1
- 7908a678df
- 7c0eacb53d
- 4bf804ac60
- 33c5918087
- d34177729d
- 7fa50328d7
- 1790aec85d
- 6bf51e758f
- 242da8c89a
.gitignore (vendored — 3 additions)
@@ -1,2 +1,5 @@
 dist
 tags
+.tmp
+*.test
+.claude
CLAUDE.md (new file — 117 lines)
@@ -0,0 +1,117 @@

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Common Development Commands

### Building

```bash
# Build for your architecture
make build

# Build release binaries for all platforms
make build-release

# Install cheat to your PATH
make install
```

### Testing and Quality Checks

```bash
# Run all tests
make test
go test ./...

# Run a single test
go test -run TestFunctionName ./internal/package_name

# Generate test coverage report
make coverage

# Run linter (revive)
make lint

# Run go vet
make vet

# Format code
make fmt

# Run all checks (vendor, fmt, lint, vet, test)
make check
```

### Development Setup

```bash
# Install development dependencies (revive linter, scc)
make setup

# Update and verify vendored dependencies
make vendor-update
```
## Architecture Overview

The `cheat` command-line tool is organized into several key packages:

### Command Layer (`cmd/cheat/`)

- `main.go`: Entry point, argument parsing, command routing
- `cmd_*.go`: Individual command implementations (view, edit, list, search, etc.)
- Commands are selected based on docopt-parsed arguments

### Core Internal Packages

1. **`internal/config`**: Configuration management
   - Loads YAML config from platform-specific paths
   - Manages editor, pager, and colorization settings
   - Validates and expands cheatpath configurations

2. **`internal/cheatpath`**: Cheatsheet path management
   - Represents collections of cheatsheets on the filesystem
   - Handles read-only vs. writable paths
   - Supports filtering and validation

3. **`internal/sheet`**: Individual cheatsheet handling
   - Parses YAML frontmatter for tags and syntax
   - Implements syntax highlighting via Chroma
   - Provides search functionality within sheets

4. **`internal/sheets`**: Collection operations
   - Loads sheets from multiple cheatpaths
   - Consolidates duplicates (local overrides global)
   - Filters by tags and sorts results

5. **`internal/display`**: Output formatting
   - Writes to stdout or a pager
   - Handles text formatting and indentation

6. **`internal/repo`**: Git repository management
   - Clones community cheatsheet repositories
   - Updates existing repositories

### Key Design Patterns

- **Filesystem-based storage**: Cheatsheets are plain text files
- **Override mechanism**: Local sheets override community sheets with the same name
- **Tag system**: Sheets can be categorized with tags in frontmatter
- **Multiple cheatpaths**: Supports personal, community, and directory-scoped sheets
### Sheet Format

Cheatsheets are plain text files optionally prefixed with YAML frontmatter:

```
---
syntax: bash
tags: [ networking, ssh ]
---
# SSH tunneling example
ssh -L 8080:localhost:80 user@remote
```
### Working with the Codebase

- Always check for `.git` directories and skip them during filesystem walks
- Use `go-git` for repository operations rather than exec'ing `git` commands
- Platform-specific paths are handled in `internal/config/paths.go`
- Color output uses ANSI codes via the Chroma library
- Test files use the `internal/mock` package for test data
CONTRIBUTING
@@ -1,48 +1,14 @@
 Contributing
 ============
-Do you want to contribute to `cheat`? There are a few ways to help:
-
-#### Submit a cheatsheet ####
-Do you have a witty bash one-liner to share? [Open a pull-request][pr] against
-the [cheatsheets][] repository. (The `cheat` executable source code lives in
-[cheat/cheat][cheat]. Cheatsheet content lives in
-[cheat/cheatsheets][cheatsheets].)
+Thank you for your interest in `cheat`.

-#### Report a bug ####
-Did you find a bug? Report it in the [issue tracker][issues]. (But before you
-do, please look through the open issues to make sure that it hasn't already
-been reported.)
+Pull requests are no longer being accepted, and have been disabled on this
+repository. The maintainer is not currently reviewing or merging external code
+contributions.

-#### Add a feature ####
-Do you have a feature that you'd like to contribute? Propose it in the [issue
-tracker][issues] to discuss with the maintainer whether it would be considered
-for merging.
+Bug reports are still welcome. If you've found a bug, please open an issue in
+the [issue tracker][issues]. Before doing so, please search through the
+existing open issues to make sure it hasn't already been reported.

-`cheat` is mostly mature and feature-complete, but may still have some room for
-new features. See [HACKING.md][hacking] for a quick-start guide to `cheat`
-development.
-
-#### Add documentation ####
-Did you encounter features, bugs, edge-cases, use-cases, or environment
-considerations that were undocumented or under-documented? Add them to the
-[wiki][]. (You may also open a pull-request against the `README`, if
-appropriate.)
-
-Do you enjoy technical writing or proofreading? Help keep the documentation
-error-free and well-organized.
-
-#### Spread the word ####
-Are you unable to do the above, but still want to contribute? You can help
-`cheat` simply by telling others about it. Share it with friends and coworkers
-that might benefit from using it.
-
-#### Pull Requests ####
-Please open all pull-requests against the `develop` branch.
-
-[cheat]: https://github.com/cheat/cheat
-[cheatsheets]: https://github.com/cheat/cheatsheets
-[hacking]: HACKING.md
 [issues]: https://github.com/cheat/cheat/issues
-[pr]: https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork
-[wiki]: https://github.com/cheat/cheat/wiki
HACKING.md — 250 changes
@@ -1,57 +1,241 @@
-Hacking
-=======
-The following is a quickstart guide for developing `cheat`.
+# Hacking Guide
+
+This document provides a comprehensive guide for developing `cheat`, including setup, architecture overview, and code patterns.
+
+## Quick Start
+
-## 1. Install system dependencies
-Before you begin, you must install a handful of system dependencies. The
-following are required, and must be available on your `PATH`:
+### 1. Install system dependencies
+
+The following are required and must be available on your `PATH`:
 - `git`
-- `go` (>= 1.17 is recommended)
+- `go` (>= 1.19 is recommended)
 - `make`

-The following dependencies are optional:
+Optional dependencies:
 - `docker`
 - `pandoc` (necessary to generate a `man` page)

-## 2. Install utility applications
-Run `make setup` to install `scc` and `revive`, which are used by various
-`make` targets.
+### 2. Install utility applications
+Run `make setup` to install `scc` and `revive`, which are used by various `make` targets.
-## 3. Development workflow
-After your environment has been configured, your development workflow will
-resemble the following:
-
-1. Make changes to the `cheat` source code.
-2. Run `make test` to run unit-tests.
-3. Fix compiler errors and failing tests as necessary.
-4. Run `make`. A `cheat` executable will be written to the `dist` directory.
-5. Use the new executable by running `dist/cheat <command>`.
-6. Run `make install` to install `cheat` to your `PATH`.
-7. Run `make build-release` to build cross-platform binaries in `dist`.
-8. Run `make clean` to clean the `dist` directory when desired.
+### 3. Development workflow
+
+1. Make changes to the `cheat` source code
+2. Run `make test` to run unit-tests
+3. Fix compiler errors and failing tests as necessary
+4. Run `make build`. A `cheat` executable will be written to the `dist` directory
+5. Use the new executable by running `dist/cheat <command>`
+6. Run `make install` to install `cheat` to your `PATH`
+7. Run `make build-release` to build cross-platform binaries in `dist`
+8. Run `make clean` to clean the `dist` directory when desired

 You may run `make help` to see a list of available `make` commands.
-### Developing with docker
-It may be useful to test your changes within a pristine environment. An
-Alpine-based docker container has been provided for that purpose.
-
-If you would like to build the docker container, run:
-```sh
+### 4. Testing
+
+#### Unit Tests
+Run unit tests with:
+```bash
 make test
 ```

+#### Integration Tests
+Integration tests that require network access are separated using build tags. Run them with:
+```bash
+make test-integration
+```
+
+To run all tests (unit and integration):
+```bash
+make test-all
+```
+
+#### Test Coverage
+Generate a coverage report with:
+```bash
+make coverage      # HTML report
+make coverage-text # Terminal output
+```
## Architecture Overview

### Package Structure

The `cheat` application follows a clean architecture with well-separated concerns:

- **`cmd/cheat/`**: Command layer with argument parsing and command routing
- **`internal/config`**: Configuration management (YAML loading, validation, paths)
- **`internal/cheatpath`**: Cheatsheet path management (collections, filtering)
- **`internal/sheet`**: Individual cheatsheet handling (parsing, search, highlighting)
- **`internal/sheets`**: Collection operations (loading, consolidation, filtering)
- **`internal/display`**: Output formatting (pager integration, colorization)
- **`internal/repo`**: Git repository management for community sheets

### Key Design Patterns

- **Filesystem-based storage**: Cheatsheets are plain text files
- **Override mechanism**: Local sheets override community sheets with the same name
- **Tag system**: Sheets can be categorized with tags in frontmatter
- **Multiple cheatpaths**: Supports personal, community, and directory-scoped sheets
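The override mechanism described above is essentially a map merge: cheatpaths are iterated in order, and later (more local) paths overwrite earlier ones when titles collide. A minimal sketch of the idea — not the actual `sheets.Consolidate` implementation, which operates on maps of `sheet.Sheet` values:

```go
package main

import "fmt"

// consolidate merges per-cheatpath sheet maps; later maps (more local
// cheatpaths) overwrite earlier ones when titles collide.
func consolidate(perPath []map[string]string) map[string]string {
	merged := make(map[string]string)
	for _, sheets := range perPath {
		for title, text := range sheets {
			merged[title] = text // later cheatpath wins
		}
	}
	return merged
}

func main() {
	community := map[string]string{"tar": "community tar sheet"}
	personal := map[string]string{"tar": "personal tar sheet"}
	merged := consolidate([]map[string]string{community, personal})
	fmt.Println(merged["tar"]) // the personal sheet shadows the community one
}
```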
## Core Types and Functions

### Config (`internal/config`)

The main configuration structure:

```go
type Config struct {
	Colorize   bool           `yaml:"colorize"`
	Editor     string         `yaml:"editor"`
	Cheatpaths []cp.Cheatpath `yaml:"cheatpaths"`
	Style      string         `yaml:"style"`
	Formatter  string         `yaml:"formatter"`
	Pager      string         `yaml:"pager"`
	Path       string
}
```

Key functions:
- `New(opts, confPath, resolve)` - Load config from file
- `Validate()` - Validate configuration values
- `Editor()` - Get editor from environment or defaults (package-level function)
- `Pager()` - Get pager from environment or defaults (package-level function)
### Cheatpath (`internal/cheatpath`)

Represents a directory containing cheatsheets:

```go
type Cheatpath struct {
	Name     string   // Friendly name (e.g., "personal")
	Path     string   // Filesystem path
	Tags     []string // Tags applied to all sheets in this path
	ReadOnly bool     // Whether sheets can be modified
}
```

### Sheet (`internal/sheet`)

Represents an individual cheatsheet:

```go
type Sheet struct {
	Title     string   // Sheet name (from filename)
	CheatPath string   // Name of the cheatpath this sheet belongs to
	Path      string   // Full filesystem path
	Text      string   // Content (without frontmatter)
	Tags      []string // Combined tags (from frontmatter + cheatpath)
	Syntax    string   // Syntax for highlighting
	ReadOnly  bool     // Whether sheet can be edited
}
```

Key methods:
- `New(title, cheatpath, path, tags, readOnly)` - Load from file
- `Search(reg)` - Search content with a compiled regexp
- `Colorize(conf)` - Apply syntax highlighting (modifies sheet in place)
- `Tagged(needle)` - Check if sheet has the given tag
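`Tagged` is, conceptually, a membership check over the sheet's combined tag list. A sketch of that idea on an abridged struct — not the exact internal implementation:

```go
package main

import "fmt"

// Sheet holds only the fields relevant to tag matching (abridged).
type Sheet struct {
	Title string
	Tags  []string
}

// Tagged reports whether the sheet carries the given tag.
func (s Sheet) Tagged(needle string) bool {
	for _, tag := range s.Tags {
		if tag == needle {
			return true
		}
	}
	return false
}

func main() {
	s := Sheet{Title: "ssh", Tags: []string{"networking", "ssh"}}
	fmt.Println(s.Tagged("networking"), s.Tagged("docker"))
}
```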
## Common Operations

### Loading and Displaying a Sheet

```go
// Load sheet
s, err := sheet.New("tar", "personal", "/path/to/tar", []string{"personal"}, false)
if err != nil {
	log.Fatal(err)
}

// Apply syntax highlighting (modifies sheet in place)
s.Colorize(conf)

// Display with pager
display.Write(s.Text, conf)
```

### Working with Sheet Collections

```go
// Load all sheets from cheatpaths (returns a slice of maps, one per cheatpath)
allSheets, err := sheets.Load(conf.Cheatpaths)
if err != nil {
	log.Fatal(err)
}

// Consolidate to handle duplicates (later cheatpaths take precedence)
consolidated := sheets.Consolidate(allSheets)

// Filter by tag (operates on the slice of maps)
filtered := sheets.Filter(allSheets, []string{"networking"})

// Sort alphabetically (returns a sorted slice)
sorted := sheets.Sort(consolidated)
```
### Sheet Format

Cheatsheets are plain text files that may begin with YAML frontmatter:

```yaml
---
syntax: bash
tags: [networking, linux, ssh]
---
# Connect to remote server
ssh user@hostname

# Copy files over SSH
scp local_file user@hostname:/remote/path
```
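The frontmatter format above can be split from the sheet body by looking for the `---` delimiters before handing the header to a YAML parser. A stdlib-only sketch of the splitting step — the real `internal/sheet` parser uses a YAML library and is more robust:

```go
package main

import (
	"fmt"
	"strings"
)

// splitFrontmatter separates an optional YAML frontmatter header from the
// sheet body. It returns the raw header (without delimiters) and the body.
func splitFrontmatter(raw string) (header, body string) {
	const delim = "---"
	if !strings.HasPrefix(raw, delim+"\n") {
		return "", raw // no frontmatter; the whole file is the body
	}
	rest := raw[len(delim)+1:]
	end := strings.Index(rest, "\n"+delim+"\n")
	if end < 0 {
		return "", raw // unterminated header; treat as plain body
	}
	return rest[:end], rest[end+len(delim)+2:]
}

func main() {
	sheet := "---\nsyntax: bash\ntags: [networking, ssh]\n---\nssh user@hostname\n"
	header, body := splitFrontmatter(sheet)
	fmt.Printf("header=%q\nbody=%q\n", header, body)
}
```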
## Testing

Run tests with:

```bash
make test       # Run all tests
make coverage   # Generate coverage report
go test ./...   # Go test directly
```

Test files follow Go conventions:
- `*_test.go` files in the same package
- Table-driven tests for multiple scenarios
- Mock data in the `internal/mock` package

## Error Handling

The codebase follows consistent error handling patterns:
- Functions return explicit errors
- Errors are wrapped with context using `fmt.Errorf`
- User-facing errors are written to stderr

Example:

```go
s, err := sheet.New(title, cheatpath, path, tags, false)
if err != nil {
	return fmt.Errorf("failed to load sheet: %w", err)
}
```
## Developing with Docker

It may be useful to test your changes within a pristine environment. An Alpine-based docker container has been provided for that purpose.

Build the docker container:

```bash
make docker-setup
```

Shell into the container:

```bash
make docker-sh
```

The `cheat` source code will be mounted at `/app` within the container.

To destroy the container:

```bash
make distclean
```

[go]: https://go.dev/
@@ -9,20 +9,20 @@ On Unix-like systems, you may simply paste the following snippet into your terminal:

 ```sh
 cd /tmp \
-  && wget https://github.com/cheat/cheat/releases/download/4.4.1/cheat-linux-amd64.gz \
+  && wget https://github.com/cheat/cheat/releases/download/4.5.0/cheat-linux-amd64.gz \
   && gunzip cheat-linux-amd64.gz \
   && chmod +x cheat-linux-amd64 \
   && sudo mv cheat-linux-amd64 /usr/local/bin/cheat
 ```

-You may need to need to change the version number (`4.4.1`) and the archive
+You may need to change the version number (`4.5.0`) and the archive
 (`cheat-linux-amd64.gz`) depending on your platform.

 See the [releases page][releases] for a list of supported platforms.

 #### Windows
-TODO: community support is requested here. Please open a PR if you'd like to
-contribute installation instructions for Windows.
+On Windows, download the appropriate binary from the [releases page][releases],
+unzip the archive, and place the `cheat.exe` executable on your `PATH`.

 ### Install via `go install`
 If you have `go` version `>=1.17` available on your `PATH`, you can install
Makefile — 114 changes

@@ -3,6 +3,9 @@ makefile := $(realpath $(lastword $(MAKEFILE_LIST)))
 cmd_dir := ./cmd/cheat
 dist_dir := ./dist

+# parallel jobs for build-release (can be overridden)
+JOBS ?= 8
+
 # executables
 CAT := cat
 COLUMN := column
@@ -31,6 +34,7 @@ TMPDIR := /tmp
 # release binaries
 releases := \
 	$(dist_dir)/cheat-darwin-amd64 \
+	$(dist_dir)/cheat-darwin-arm64 \
 	$(dist_dir)/cheat-linux-386 \
 	$(dist_dir)/cheat-linux-amd64 \
 	$(dist_dir)/cheat-linux-arm5 \
@@ -39,76 +43,83 @@ releases := \
 	$(dist_dir)/cheat-linux-arm7 \
 	$(dist_dir)/cheat-netbsd-amd64 \
 	$(dist_dir)/cheat-openbsd-amd64 \
 	$(dist_dir)/cheat-plan9-amd64 \
 	$(dist_dir)/cheat-solaris-amd64 \
 	$(dist_dir)/cheat-windows-amd64.exe
 ## build: build an executable for your architecture
 .PHONY: build
-build: | clean $(dist_dir) generate fmt lint vet vendor man
+build: | clean $(dist_dir) fmt lint vet vendor man
 	$(GO) build $(BUILD_FLAGS) -o $(dist_dir)/cheat $(cmd_dir)

 ## build-release: build release executables
+# Runs prepare once, then builds all binaries in parallel
+# Override jobs with: make build-release JOBS=16
 .PHONY: build-release
-build-release: $(releases)
+build-release: prepare
+	$(MAKE) -j$(JOBS) $(releases)
 # cheat-darwin-amd64
-$(dist_dir)/cheat-darwin-amd64: prepare
+$(dist_dir)/cheat-darwin-amd64:
 	GOARCH=amd64 GOOS=darwin \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

+# cheat-darwin-arm64
+$(dist_dir)/cheat-darwin-arm64:
+	GOARCH=arm64 GOOS=darwin \
+	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-linux-386
-$(dist_dir)/cheat-linux-386: prepare
+$(dist_dir)/cheat-linux-386:
 	GOARCH=386 GOOS=linux \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-linux-amd64
-$(dist_dir)/cheat-linux-amd64: prepare
+$(dist_dir)/cheat-linux-amd64:
 	GOARCH=amd64 GOOS=linux \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-linux-arm5
-$(dist_dir)/cheat-linux-arm5: prepare
+$(dist_dir)/cheat-linux-arm5:
 	GOARCH=arm GOOS=linux GOARM=5 \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-linux-arm6
-$(dist_dir)/cheat-linux-arm6: prepare
+$(dist_dir)/cheat-linux-arm6:
 	GOARCH=arm GOOS=linux GOARM=6 \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-linux-arm7
-$(dist_dir)/cheat-linux-arm7: prepare
+$(dist_dir)/cheat-linux-arm7:
 	GOARCH=arm GOOS=linux GOARM=7 \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

+# cheat-linux-arm64
+$(dist_dir)/cheat-linux-arm64:
+	GOARCH=arm64 GOOS=linux \
+	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-netbsd-amd64
-$(dist_dir)/cheat-netbsd-amd64: prepare
+$(dist_dir)/cheat-netbsd-amd64:
 	GOARCH=amd64 GOOS=netbsd \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-openbsd-amd64
-$(dist_dir)/cheat-openbsd-amd64: prepare
+$(dist_dir)/cheat-openbsd-amd64:
 	GOARCH=amd64 GOOS=openbsd \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-plan9-amd64
-$(dist_dir)/cheat-plan9-amd64: prepare
+$(dist_dir)/cheat-plan9-amd64:
 	GOARCH=amd64 GOOS=plan9 \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-solaris-amd64
-$(dist_dir)/cheat-solaris-amd64: prepare
+$(dist_dir)/cheat-solaris-amd64:
 	GOARCH=amd64 GOOS=solaris \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

 # cheat-windows-amd64
-$(dist_dir)/cheat-windows-amd64.exe: prepare
+$(dist_dir)/cheat-windows-amd64.exe:
 	GOARCH=amd64 GOOS=windows \
 	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(ZIP) $@.zip $@ -j
@@ -116,9 +127,9 @@ $(dist_dir)/cheat-windows-amd64.exe: prepare
 $(dist_dir):
 	$(MKDIR) $(dist_dir)

-.PHONY: generate
-generate:
-	$(GO) generate $(cmd_dir)
+# .tmp
+.tmp:
+	$(MKDIR) .tmp

 ## install: build and install cheat on your PATH
 .PHONY: install
@@ -128,7 +139,8 @@ install: build
 ## clean: remove compiled executables
 .PHONY: clean
 clean:
-	$(RM) -f $(dist_dir)/* $(cmd_dir)/str_config.go $(cmd_dir)/str_usage.go
+	$(RM) -f $(dist_dir)/*
+	$(RM) -rf .tmp

 ## distclean: remove the tags file
 .PHONY: distclean
@@ -139,7 +151,8 @@ distclean:
 ## setup: install revive (linter) and scc (sloc tool)
 .PHONY: setup
 setup:
-	GO111MODULE=off $(GO) get -u github.com/boyter/scc github.com/mgechev/revive
+	$(GO) install github.com/boyter/scc@latest
+	$(GO) install github.com/mgechev/revive@latest

 ## sloc: count "semantic lines of code"
 .PHONY: sloc
@@ -163,6 +176,7 @@ vendor:
 	$(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify

 ## vendor-update: update vendored dependencies
 .PHONY: vendor-update
 vendor-update:
 	$(GO) get -t -u ./... && $(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify
@@ -186,18 +200,70 @@ vet:
 test:
 	$(GO) test ./...

+## test-integration: run integration tests (requires network)
+.PHONY: test-integration
+test-integration:
+	$(GO) test -tags=integration -count=1 ./...
+
+## test-all: run all tests (unit and integration)
+.PHONY: test-all
+test-all: test test-integration
+
+## test-fuzz: run quick fuzz tests for security-critical functions
+.PHONY: test-fuzz
+test-fuzz:
+	@./build/fuzz.sh 15s
+
+## test-fuzz-long: run extended fuzz tests (10 minutes each)
+.PHONY: test-fuzz-long
+test-fuzz-long:
+	@./build/fuzz.sh 10m
+
 ## coverage: generate a test coverage report
 .PHONY: coverage
-coverage:
-	$(GO) test ./... -coverprofile=$(TMPDIR)/cheat-coverage.out && \
-	$(GO) tool cover -html=$(TMPDIR)/cheat-coverage.out
+coverage: .tmp
+	$(GO) test ./... -coverprofile=.tmp/cheat-coverage.out && \
+	$(GO) tool cover -html=.tmp/cheat-coverage.out -o .tmp/cheat-coverage.html && \
+	echo "Coverage report generated: .tmp/cheat-coverage.html" && \
+	(sensible-browser .tmp/cheat-coverage.html 2>/dev/null || \
+	xdg-open .tmp/cheat-coverage.html 2>/dev/null || \
+	open .tmp/cheat-coverage.html 2>/dev/null || \
+	echo "Please open .tmp/cheat-coverage.html in your browser")
+
+## coverage-text: show test coverage by function in terminal
+.PHONY: coverage-text
+coverage-text: .tmp
+	$(GO) test ./... -coverprofile=.tmp/cheat-coverage.out && \
+	$(GO) tool cover -func=.tmp/cheat-coverage.out | $(SORT) -k3 -n
+
+## benchmark: run performance benchmarks
+.PHONY: benchmark
+benchmark: .tmp
+	$(GO) test -tags=integration -bench=. -benchtime=10s -benchmem ./cmd/cheat | tee .tmp/benchmark-latest.txt && \
+	$(RM) -f cheat.test
+
+## benchmark-cpu: run benchmarks with CPU profiling
+.PHONY: benchmark-cpu
+benchmark-cpu: .tmp
+	$(GO) test -tags=integration -bench=. -benchtime=10s -cpuprofile=.tmp/cpu.prof ./cmd/cheat && \
+	$(RM) -f cheat.test && \
+	echo "CPU profile saved to .tmp/cpu.prof" && \
+	echo "View with: go tool pprof -http=:8080 .tmp/cpu.prof"
+
+## benchmark-mem: run benchmarks with memory profiling
+.PHONY: benchmark-mem
+benchmark-mem: .tmp
+	$(GO) test -tags=integration -bench=. -benchtime=10s -benchmem -memprofile=.tmp/mem.prof ./cmd/cheat && \
+	$(RM) -f cheat.test && \
+	echo "Memory profile saved to .tmp/mem.prof" && \
+	echo "View with: go tool pprof -http=:8080 .tmp/mem.prof"
+
 ## check: format, lint, vet, vendor, and run unit-tests
 .PHONY: check
 check: | vendor fmt lint vet test

 .PHONY: prepare
-prepare: | clean $(dist_dir) generate vendor fmt lint vet test
+prepare: | clean $(dist_dir) vendor fmt lint vet test

 ## docker-setup: create a docker image for use during development
 .PHONY: docker-setup
@@ -117,7 +117,7 @@ cheat tar # file is named "tar"
 cheat foo/bar # file is named "bar", in a "foo" subdirectory
 ```

-Cheatsheet text may optionally be preceeded by a YAML frontmatter header that
+Cheatsheet text may optionally be preceded by a YAML frontmatter header that
 assigns tags and specifies syntax:

 ```
(deleted file — 92 lines)
@@ -1,92 +0,0 @@

//go:build ignore
// +build ignore

// This script embeds `docopt.txt` and `conf.yml` into the binary at
// build time.

package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"os"
	"path/filepath"
)

func main() {

	// get the cwd
	cwd, err := os.Getwd()
	if err != nil {
		log.Fatal(err)
	}

	// get the project root
	root, err := filepath.Abs(cwd + "../../../")
	if err != nil {
		log.Fatal(err)
	}

	// specify template file information
	type file struct {
		In     string
		Out    string
		Method string
	}

	// enumerate the template files to process
	files := []file{
		file{
			In:     "cmd/cheat/docopt.txt",
			Out:    "cmd/cheat/str_usage.go",
			Method: "usage"},
		file{
			In:     "configs/conf.yml",
			Out:    "cmd/cheat/str_config.go",
			Method: "configs"},
	}

	// iterate over each static file
	for _, file := range files {

		// delete the outfile
		os.Remove(filepath.Join(root, file.Out))

		// read the static template
		bytes, err := ioutil.ReadFile(filepath.Join(root, file.In))
		if err != nil {
			log.Fatal(err)
		}

		// render the template
		data := template(file.Method, string(bytes))

		// write the file to the specified outpath
		spath := filepath.Join(root, file.Out)
		err = ioutil.WriteFile(spath, []byte(data), 0644)
		if err != nil {
			log.Fatal(err)
		}
	}
}

// template wraps the file body in a generated Go source file that exposes
// it via a function named after method.
func template(method string, body string) string {

	// specify the template string
	t := `package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

func %s() string {
	return strings.TrimSpace(%s)
}
`

	return fmt.Sprintf(t, method, "`"+body+"`")
}
build/fuzz.sh (new executable file — 37 lines)
@@ -0,0 +1,37 @@

#!/bin/bash
#
# Run fuzz tests for cheat
# Usage: ./build/fuzz.sh [duration]
#
# Note: Go's fuzzer will fail immediately if it finds a known failing input
# in the corpus (testdata/fuzz/*). This is by design - it ensures you fix
# known bugs before searching for new ones. To see failing inputs:
#   ls internal/*/testdata/fuzz/*/
#

set -e

DURATION="${1:-15s}"

# Define fuzz tests: "TestName:Package:Description"
TESTS=(
	"FuzzParse:./internal/sheet:YAML frontmatter parsing"
	"FuzzValidateSheetName:./internal/cheatpath:sheet name validation (path traversal protection)"
	"FuzzSearchRegex:./internal/sheet:regex search operations"
	"FuzzSearchCatastrophicBacktracking:./internal/sheet:catastrophic backtracking"
	"FuzzTagged:./internal/sheet:tag matching with malicious input"
	"FuzzFilter:./internal/sheets:tag filtering operations"
	"FuzzTags:./internal/sheets:tag aggregation and sorting"
)

echo "Running fuzz tests ($DURATION each)..."
echo

for i in "${!TESTS[@]}"; do
	IFS=':' read -r test_name package description <<< "${TESTS[$i]}"
	echo "$((i+1)). Testing $description..."
	go test -fuzz="^${test_name}$" -fuzztime="$DURATION" "$package"
	echo
done

echo "All fuzz tests passed!"
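The targets listed in the script are Go native fuzz functions, run via `go test -fuzz`. The invariant style they exercise — compare the function under test against a simple oracle on randomized inputs — can be illustrated as an ordinary program (a hand-rolled loop for illustration; the real `FuzzTagged` lives in `internal/sheet` as a `testing.F` fuzz target and exercises the actual `Sheet.Tagged` method):

```go
package main

import (
	"fmt"
	"math/rand"
	"strings"
)

// tagged is the property under test: membership over a tag list.
func tagged(tags []string, needle string) bool {
	for _, t := range tags {
		if t == needle {
			return true
		}
	}
	return false
}

func main() {
	// Randomized check: membership must agree with a join/contains oracle
	// whenever no tag contains the separator character.
	r := rand.New(rand.NewSource(1))
	alphabet := []string{"ssh", "net", "db", "vcs", "txt"}
	for i := 0; i < 1000; i++ {
		tags := make([]string, r.Intn(4))
		for j := range tags {
			tags[j] = alphabet[r.Intn(len(alphabet))]
		}
		needle := alphabet[r.Intn(len(alphabet))]
		oracle := strings.Contains(","+strings.Join(tags, ",")+",", ","+needle+",")
		if tagged(tags, needle) != oracle {
			panic(fmt.Sprintf("mismatch: tags=%v needle=%q", tags, needle))
		}
	}
	fmt.Println("1000 random cases passed")
}
```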
@@ -17,6 +17,12 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {

 	cheatsheet := opts["--edit"].(string)

+	// validate the cheatsheet name
+	if err := cheatpath.ValidateSheetName(cheatsheet); err != nil {
+		fmt.Fprintf(os.Stderr, "invalid cheatsheet name: %v\n", err)
+		os.Exit(1)
+	}
+
 	// load the cheatsheets
 	cheatsheets, err := sheets.Load(conf.Cheatpaths)
 	if err != nil {
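The `cheatpath.ValidateSheetName` guard added above (and in the other commands) protects against path traversal: a user-supplied sheet name is later joined onto a cheatpath, so names like `../etc/passwd` must be rejected before that happens. A sketch of the contract — the rules shown here are assumptions for illustration, not the exact checks in `internal/cheatpath`:

```go
package main

import (
	"errors"
	"fmt"
	"path/filepath"
	"strings"
)

// validateSheetName rejects names that could escape the cheatpath root.
// (Assumed rules for illustration; see internal/cheatpath for the real checks.)
func validateSheetName(name string) error {
	if name == "" {
		return errors.New("sheet name is empty")
	}
	if filepath.IsAbs(name) {
		return errors.New("sheet name must be relative")
	}
	for _, part := range strings.Split(filepath.ToSlash(name), "/") {
		if part == ".." {
			return errors.New("sheet name must not contain '..'")
		}
	}
	return nil
}

func main() {
	for _, name := range []string{"tar", "foo/bar", "../etc/passwd"} {
		fmt.Println(name, "->", validateSheetName(name) == nil)
	}
}
```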
@@ -5,15 +5,22 @@ import (
 	"os"
 	"strings"

+	"github.com/cheat/cheat/internal/cheatpath"
 	"github.com/cheat/cheat/internal/config"
 	"github.com/cheat/cheat/internal/sheets"
 )

-// cmdRemove opens a cheatsheet for editing (or creates it if it doesn't exist).
+// cmdRemove removes (deletes) a cheatsheet.
 func cmdRemove(opts map[string]interface{}, conf config.Config) {

 	cheatsheet := opts["--rm"].(string)

+	// validate the cheatsheet name
+	if err := cheatpath.ValidateSheetName(cheatsheet); err != nil {
+		fmt.Fprintf(os.Stderr, "invalid cheatsheet name: %v\n", err)
+		os.Exit(1)
+	}
+
 	// load the cheatsheets
 	cheatsheets, err := sheets.Load(conf.Cheatpaths)
 	if err != nil {
@@ -31,6 +31,21 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
		)
	}

	// prepare the search pattern
	pattern := "(?i)" + phrase

	// unless --regex is provided, in which case we pass the regex unaltered
	if opts["--regex"] == true {
		pattern = phrase
	}

	// compile the regex once, outside the loop
	reg, err := regexp.Compile(pattern)
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
		os.Exit(1)
	}

	// iterate over each cheatpath
	out := ""
	for _, pathcheats := range cheatsheets {
@@ -44,21 +59,6 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
			continue
		}

		// assume that we want to perform a case-insensitive search for <phrase>
		pattern := "(?i)" + phrase

		// unless --regex is provided, in which case we pass the regex unaltered
		if opts["--regex"] == true {
			pattern = phrase
		}

		// compile the regex
		reg, err := regexp.Compile(pattern)
		if err != nil {
			fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
			os.Exit(1)
		}

		// `Search` will return text entries that match the search terms.
		// We're using it here to overwrite the prior cheatsheet Text,
		// filtering it to only what is relevant.
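The refactor above hoists the regex compilation out of the per-cheatsheet loop, so an invalid pattern fails fast and a valid one is compiled exactly once. A minimal standalone sketch of that pattern (the `searchSheets` helper and the sample sheet texts are illustrative, not the actual cheat types):

```go
package main

import (
	"fmt"
	"regexp"
)

// searchSheets returns the entries of sheets that match phrase.
// Unless useRegex is set, the search is case-insensitive: the
// phrase is prefixed with the "(?i)" flag before compilation.
func searchSheets(phrase string, useRegex bool, sheets []string) ([]string, error) {
	pattern := "(?i)" + phrase
	if useRegex {
		// pass the regex through unaltered
		pattern = phrase
	}

	// compile once, outside the loop over cheatsheets
	reg, err := regexp.Compile(pattern)
	if err != nil {
		return nil, fmt.Errorf("failed to compile regexp: %s, %w", pattern, err)
	}

	var matches []string
	for _, text := range sheets {
		if reg.MatchString(text) {
			matches = append(matches, text)
		}
	}
	return matches, nil
}

func main() {
	matches, err := searchSheets("ssh", false, []string{
		"SSH into a host",
		"tar a directory",
		"restart sshd",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(matches)
}
```

Compiling before the loop also means a malformed `--regex` argument is reported once, instead of once per cheatpath.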
73	cmd/cheat/config.go	Normal file
@@ -0,0 +1,73 @@
package main

// configs returns the default configuration template
func configs() string {
	return `---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# The paths at which cheatsheets are available. Tags associated with a cheatpath
# are automatically attached to all cheatsheets residing on that path.
#
# Whenever cheatsheets share the same title (like 'tar'), the most local
# cheatsheets (those which come later in this file) take precedence over the
# less local sheets. This allows you to create your own "overrides" for
# "upstream" cheatsheets.
#
# But what if you want to view the "upstream" cheatsheets instead of your own?
# Cheatsheets may be filtered by 'tags' in combination with the '--tag' flag.
#
# Example: 'cheat tar --tag=community' will display the 'tar' cheatsheet that
# is tagged as 'community' rather than your own.
#
# Paths that come earlier are considered to be the most "global", and paths
# that come later are considered to be the most "local". The most "local" paths
# take precedence.
#
# See: https://github.com/cheat/cheat/blob/master/doc/cheat.1.md#cheatpaths
cheatpaths:

  # Cheatsheets that are tagged "personal" are stored here by default:
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # Cheatsheets that are tagged "work" are stored here by default:
  - name: work
    path: WORK_PATH
    tags: [ work ]
    readonly: false

  # Community cheatsheets are stored here by default:
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # You can also use glob patterns to automatically load cheatsheets from all
  # directories that match.
  #
  # Example: overload cheatsheets for projects under ~/src/github.com/example/*/
  #- name: example-projects
  #  path: ~/src/github.com/example/**/.cheat
  #  tags: [ example ]
  #  readonly: true`
}
@@ -1,59 +0,0 @@
Usage:
    cheat [options] [<cheatsheet>]

Options:
    --init                   Write a default config file to stdout
    -a --all                 Search among all cheatpaths
    -c --colorize            Colorize output
    -d --directories         List cheatsheet directories
    -e --edit=<cheatsheet>   Edit <cheatsheet>
    -l --list                List cheatsheets
    -p --path=<name>         Return only sheets found on cheatpath <name>
    -r --regex               Treat search <phrase> as a regex
    -s --search=<phrase>     Search cheatsheets for <phrase>
    -t --tag=<tag>           Return only sheets matching <tag>
    -T --tags                List all tags in use
    -v --version             Print the version number
    --rm=<cheatsheet>        Remove (delete) <cheatsheet>
    --conf                   Display the config file path

Examples:

    To initialize a config file:
        mkdir -p ~/.config/cheat && cheat --init > ~/.config/cheat/conf.yml

    To view the tar cheatsheet:
        cheat tar

    To edit (or create) the foo cheatsheet:
        cheat -e foo

    To edit (or create) the foo/bar cheatsheet on the "work" cheatpath:
        cheat -p work -e foo/bar

    To view all cheatsheet directories:
        cheat -d

    To list all available cheatsheets:
        cheat -l

    To list all cheatsheets whose titles match "apt":
        cheat -l apt

    To list all tags in use:
        cheat -T

    To list available cheatsheets that are tagged as "personal":
        cheat -l -t personal

    To search for "ssh" among all cheatsheets, and colorize matches:
        cheat -c -s ssh

    To search (by regex) for cheatsheets that contain an IP address:
        cheat -c -r -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'

    To remove (delete) the foo/bar cheatsheet:
        cheat --rm foo/bar

    To view the configuration file path:
        cheat --conf
@@ -1,8 +1,6 @@
// Package main serves as the executable entrypoint.
package main

//go:generate go run ../../build/embed.go

import (
	"fmt"
	"os"
@@ -17,7 +15,7 @@ import (
	"github.com/cheat/cheat/internal/installer"
)

const version = "4.4.1"
const version = "4.5.0"

func main() {

@@ -45,6 +43,7 @@ func main() {
	// read the envvars into a map of strings
	envvars := map[string]string{}
	for _, e := range os.Environ() {
		// os.Environ() guarantees "key=value" format (see ADR-002)
		pair := strings.SplitN(e, "=", 2)
		if runtime.GOOS == "windows" {
			pair[0] = strings.ToUpper(pair[0])
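The envvar-reading loop above depends on `os.Environ()` returning `"key=value"` strings; using `strings.SplitN` with a limit of 2 ensures any `=` characters inside the value survive the split. A standalone sketch of that parsing (the `windows` flag here stands in for the `runtime.GOOS == "windows"` check):

```go
package main

import (
	"fmt"
	"strings"
)

// parseEnviron reads "key=value" pairs into a map. SplitN with a
// limit of 2 splits only on the first '=', so values containing
// '=' are preserved intact. Windows treats variable names as
// case-insensitive, so keys are upper-cased when windows is true.
func parseEnviron(environ []string, windows bool) map[string]string {
	envvars := map[string]string{}
	for _, e := range environ {
		pair := strings.SplitN(e, "=", 2)
		if len(pair) != 2 {
			continue // defensive; os.Environ() guarantees the format
		}
		if windows {
			pair[0] = strings.ToUpper(pair[0])
		}
		envvars[pair[0]] = pair[1]
	}
	return envvars
}

func main() {
	env := parseEnviron([]string{"Path=/usr/bin", "X=a=b"}, true)
	fmt.Println(env["PATH"], env["X"])
}
```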
216	cmd/cheat/path_traversal_integration_test.go	Normal file
@@ -0,0 +1,216 @@
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
	"testing"
)

// TestPathTraversalIntegration tests that the cheat binary properly blocks
// path traversal attempts when invoked as a subprocess.
func TestPathTraversalIntegration(t *testing.T) {
	// Build the cheat binary
	binPath := filepath.Join(t.TempDir(), "cheat_test")
	if output, err := exec.Command("go", "build", "-o", binPath, ".").CombinedOutput(); err != nil {
		t.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment
	testDir := t.TempDir()
	sheetsDir := filepath.Join(testDir, "sheets")
	os.MkdirAll(sheetsDir, 0755)

	// Create config
	config := fmt.Sprintf(`---
editor: echo
colorize: false
pager: cat
cheatpaths:
  - name: test
    path: %s
    readonly: false
`, sheetsDir)

	configPath := filepath.Join(testDir, "config.yml")
	if err := os.WriteFile(configPath, []byte(config), 0644); err != nil {
		t.Fatalf("Failed to write config: %v", err)
	}

	// Test table
	tests := []struct {
		name     string
		command  []string
		wantFail bool
		wantMsg  string
	}{
		// Blocked patterns
		{
			name:     "block parent traversal edit",
			command:  []string{"--edit", "../evil"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block absolute path edit",
			command:  []string{"--edit", "/etc/passwd"},
			wantFail: true,
			wantMsg:  "cannot be an absolute path",
		},
		{
			name:     "block home dir edit",
			command:  []string{"--edit", "~/.ssh/config"},
			wantFail: true,
			wantMsg:  "cannot start with '~'",
		},
		{
			name:     "block parent traversal remove",
			command:  []string{"--rm", "../evil"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block complex traversal",
			command:  []string{"--edit", "foo/../../bar"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block just dots",
			command:  []string{"--edit", ".."},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block empty name",
			command:  []string{"--edit", ""},
			wantFail: true,
			wantMsg:  "cannot be empty",
		},
		// Allowed patterns
		{
			name:     "allow simple name",
			command:  []string{"--edit", "docker"},
			wantFail: false,
		},
		{
			name:     "allow nested name",
			command:  []string{"--edit", "lang/go"},
			wantFail: false,
		},
		{
			name:     "block hidden file",
			command:  []string{"--edit", ".gitignore"},
			wantFail: true,
			wantMsg:  "cannot start with '.'",
		},
		{
			name:     "allow current dir",
			command:  []string{"--edit", "./local"},
			wantFail: false,
		},
	}

	// Run tests
	for _, tc := range tests {
		t.Run(tc.name, func(t *testing.T) {
			cmd := exec.Command(binPath, tc.command...)
			cmd.Env = []string{
				fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configPath),
				fmt.Sprintf("HOME=%s", testDir),
			}
			output, err := cmd.CombinedOutput()

			if tc.wantFail {
				if err == nil {
					t.Errorf("Expected failure but command succeeded. Output: %s", output)
				}
				if !strings.Contains(string(output), "invalid cheatsheet name") {
					t.Errorf("Expected 'invalid cheatsheet name' error, got: %s", output)
				}
				if tc.wantMsg != "" && !strings.Contains(string(output), tc.wantMsg) {
					t.Errorf("Expected message %q in output, got: %s", tc.wantMsg, output)
				}
			} else {
				// Command might fail for other reasons (e.g., editor not found)
				// but should NOT fail with "invalid cheatsheet name"
				if strings.Contains(string(output), "invalid cheatsheet name") {
					t.Errorf("Command incorrectly blocked. Output: %s", output)
				}
			}
		})
	}
}

// TestPathTraversalRealWorld tests with more realistic scenarios
func TestPathTraversalRealWorld(t *testing.T) {
	// This test ensures our protection works with actual file operations

	// Build cheat
	binPath := filepath.Join(t.TempDir(), "cheat_test")
	if output, err := exec.Command("go", "build", "-o", binPath, ".").CombinedOutput(); err != nil {
		t.Fatalf("Failed to build: %v\n%s", err, output)
	}

	// Create test structure
	testRoot := t.TempDir()
	sheetsDir := filepath.Join(testRoot, "cheatsheets")
	secretDir := filepath.Join(testRoot, "secrets")
	os.MkdirAll(sheetsDir, 0755)
	os.MkdirAll(secretDir, 0755)

	// Create a "secret" file that should not be accessible
	secretFile := filepath.Join(secretDir, "secret.txt")
	os.WriteFile(secretFile, []byte("SECRET DATA"), 0644)

	// Create config using vim in non-interactive mode
	config := fmt.Sprintf(`---
editor: vim -u NONE -n --cmd "set noswapfile" --cmd "wq"
colorize: false
pager: cat
cheatpaths:
  - name: personal
    path: %s
    readonly: false
`, sheetsDir)

	configPath := filepath.Join(testRoot, "config.yml")
	os.WriteFile(configPath, []byte(config), 0644)

	// Test 1: Try to edit a file outside cheatsheets using traversal
	cmd := exec.Command(binPath, "--edit", "../secrets/secret")
	cmd.Env = []string{
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configPath),
		fmt.Sprintf("HOME=%s", testRoot),
	}
	output, err := cmd.CombinedOutput()

	if err == nil || !strings.Contains(string(output), "invalid cheatsheet name") {
		t.Errorf("Path traversal was not blocked! Output: %s", output)
	}

	// Test 2: Verify the secret file is still intact
	content, _ := os.ReadFile(secretFile)
	if string(content) != "SECRET DATA" {
		t.Errorf("Secret file was modified!")
	}

	// Test 3: Verify no files were created outside sheets directory
	err = filepath.Walk(testRoot, func(path string, info os.FileInfo, err error) error {
		if err != nil {
			return err
		}
		if !info.IsDir() &&
			path != configPath &&
			path != secretFile &&
			!strings.HasPrefix(path, sheetsDir) {
			t.Errorf("File created outside allowed directory: %s", path)
		}
		return nil
	})
	if err != nil {
		t.Errorf("Walk error: %v", err)
	}
}
209	cmd/cheat/search_bench_test.go	Normal file
@@ -0,0 +1,209 @@
//go:build integration

package main

import (
	"bytes"
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"testing"
	"time"

	"github.com/go-git/go-git/v5"
	"github.com/go-git/go-git/v5/plumbing"
)

// BenchmarkSearchCommand benchmarks the actual cheat search command
func BenchmarkSearchCommand(b *testing.B) {
	// Build the cheat binary in .tmp (using absolute path)
	rootDir, err := filepath.Abs(filepath.Join("..", ".."))
	if err != nil {
		b.Fatalf("Failed to get root dir: %v", err)
	}
	tmpDir := filepath.Join(rootDir, ".tmp", "bench-test")
	if err := os.MkdirAll(tmpDir, 0755); err != nil {
		b.Fatalf("Failed to create temp dir: %v", err)
	}

	cheatBin := filepath.Join(tmpDir, "cheat-bench")

	// Clean up the binary when done
	b.Cleanup(func() {
		os.Remove(cheatBin)
	})

	cmd := exec.Command("go", "build", "-o", cheatBin, "./cmd/cheat")
	cmd.Dir = rootDir
	if output, err := cmd.CombinedOutput(); err != nil {
		b.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment in .tmp
	configDir := filepath.Join(tmpDir, "config")
	cheatsheetDir := filepath.Join(configDir, "cheatsheets", "community")

	// Clone community cheatsheets (or reuse if already exists)
	if _, err := os.Stat(cheatsheetDir); os.IsNotExist(err) {
		b.Logf("Cloning community cheatsheets to %s...", cheatsheetDir)
		_, err := git.PlainClone(cheatsheetDir, false, &git.CloneOptions{
			URL:           "https://github.com/cheat/cheatsheets.git",
			Depth:         1,
			SingleBranch:  true,
			ReferenceName: plumbing.ReferenceName("refs/heads/master"),
			Progress:      nil,
		})
		if err != nil {
			b.Fatalf("Failed to clone cheatsheets: %v", err)
		}
	}

	// Create a minimal config file
	configFile := filepath.Join(configDir, "conf.yml")
	configContent := fmt.Sprintf(`---
cheatpaths:
  - name: community
    path: %s
    tags: [ community ]
    readonly: true
`, cheatsheetDir)

	if err := os.MkdirAll(configDir, 0755); err != nil {
		b.Fatalf("Failed to create config dir: %v", err)
	}
	if err := os.WriteFile(configFile, []byte(configContent), 0644); err != nil {
		b.Fatalf("Failed to write config: %v", err)
	}

	// Set environment to use our config
	env := append(os.Environ(),
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configFile),
	)

	// Define test cases
	testCases := []struct {
		name string
		args []string
	}{
		{"SimpleSearch", []string{"-s", "echo"}},
		{"RegexSearch", []string{"-r", "-s", "^#.*example"}},
		{"ColorizedSearch", []string{"-c", "-s", "grep"}},
		{"ComplexRegex", []string{"-r", "-s", "(git|hg|svn)\\s+(add|commit|push)"}},
		{"AllCheatpaths", []string{"-a", "-s", "list"}},
	}

	// Warm up - run once to ensure everything is loaded
	warmupCmd := exec.Command(cheatBin, "-l")
	warmupCmd.Env = env
	warmupCmd.Run()

	// Run benchmarks
	for _, tc := range testCases {
		b.Run(tc.name, func(b *testing.B) {
			// Reset timer to exclude setup
			b.ResetTimer()

			for i := 0; i < b.N; i++ {
				cmd := exec.Command(cheatBin, tc.args...)
				cmd.Env = env

				// Capture output to prevent spamming
				var stdout, stderr bytes.Buffer
				cmd.Stdout = &stdout
				cmd.Stderr = &stderr

				start := time.Now()
				err := cmd.Run()
				elapsed := time.Since(start)

				if err != nil {
					b.Fatalf("Command failed: %v\nStderr: %s", err, stderr.String())
				}

				// Report custom metric
				b.ReportMetric(float64(elapsed.Nanoseconds())/1e6, "ms/op")

				// Ensure we got some results
				if stdout.Len() == 0 {
					b.Fatal("No output from search")
				}
			}
		})
	}
}

// BenchmarkListCommand benchmarks the list command for comparison
func BenchmarkListCommand(b *testing.B) {
	// Build the cheat binary in .tmp (using absolute path)
	rootDir, err := filepath.Abs(filepath.Join("..", ".."))
	if err != nil {
		b.Fatalf("Failed to get root dir: %v", err)
	}
	tmpDir := filepath.Join(rootDir, ".tmp", "bench-test")
	if err := os.MkdirAll(tmpDir, 0755); err != nil {
		b.Fatalf("Failed to create temp dir: %v", err)
	}

	cheatBin := filepath.Join(tmpDir, "cheat-bench")

	// Clean up the binary when done
	b.Cleanup(func() {
		os.Remove(cheatBin)
	})

	cmd := exec.Command("go", "build", "-o", cheatBin, "./cmd/cheat")
	cmd.Dir = rootDir
	if output, err := cmd.CombinedOutput(); err != nil {
		b.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment (simplified - reuse if possible)
	configDir := filepath.Join(tmpDir, "config")
	cheatsheetDir := filepath.Join(configDir, "cheatsheets", "community")

	// Check if we need to clone
	if _, err := os.Stat(cheatsheetDir); os.IsNotExist(err) {
		_, err := git.PlainClone(cheatsheetDir, false, &git.CloneOptions{
			URL:           "https://github.com/cheat/cheatsheets.git",
			Depth:         1,
			SingleBranch:  true,
			ReferenceName: plumbing.ReferenceName("refs/heads/master"),
			Progress:      nil,
		})
		if err != nil {
			b.Fatalf("Failed to clone cheatsheets: %v", err)
		}
	}

	// Create config
	configFile := filepath.Join(configDir, "conf.yml")
	configContent := fmt.Sprintf(`---
cheatpaths:
  - name: community
    path: %s
    tags: [ community ]
    readonly: true
`, cheatsheetDir)

	os.MkdirAll(configDir, 0755)
	os.WriteFile(configFile, []byte(configContent), 0644)

	env := append(os.Environ(),
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configFile),
	)

	b.ResetTimer()

	for i := 0; i < b.N; i++ {
		cmd := exec.Command(cheatBin, "-l")
		cmd.Env = env

		var stdout bytes.Buffer
		cmd.Stdout = &stdout

		if err := cmd.Run(); err != nil {
			b.Fatalf("Command failed: %v", err)
		}
	}
}
@@ -1,93 +0,0 @@
package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

func configs() string {
	return strings.TrimSpace(`---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# Cheatpaths are paths at which cheatsheets are available on your local
# filesystem.
#
# It is useful to sort cheatsheets into different cheatpaths for organizational
# purposes. For example, you might want one cheatpath for community
# cheatsheets, one for personal cheatsheets, one for cheatsheets pertaining to
# your day job, one for code snippets, etc.
#
# Cheatpaths are scoped, such that more "local" cheatpaths take priority over
# more "global" cheatpaths. (The most global cheatpath is listed first in this
# file; the most local is listed last.) For example, if there is a 'tar'
# cheatsheet on both global and local paths, you'll be presented with the local
# one by default. ('cheat -p' can be used to view cheatsheets from alternative
# cheatpaths.)
#
# Cheatpaths can also be tagged as "read only". This instructs cheat not to
# automatically create cheatsheets on a read-only cheatpath. Instead, when you
# would like to edit a read-only cheatsheet using 'cheat -e', cheat will
# perform a copy-on-write of that cheatsheet from a read-only cheatpath to a
# writeable cheatpath.
#
# This is very useful when you would like to maintain, for example, a
# "pristine" repository of community cheatsheets on one cheatpath, and an
# editable personal repository of cheatsheets on another cheatpath.
#
# Cheatpaths can be also configured to automatically apply tags to cheatsheets
# on certain paths, which can be useful for querying purposes.
# Example: 'cheat -t work jenkins'.
#
# Community cheatsheets must be installed separately, though you may have
# downloaded them automatically when installing 'cheat'. If not, you may
# download them here:
#
# https://github.com/cheat/cheatsheets
cheatpaths:
  # Cheatpath properties mean the following:
  #   'name': the name of the cheatpath (view with 'cheat -d', filter with 'cheat -p')
  #   'path': the filesystem path of the cheatsheet directory (view with 'cheat -d')
  #   'tags': tags that should be automatically applied to sheets on this path
  #   'readonly': shall user-created ('cheat -e') cheatsheets be saved here?
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # If you have personalized cheatsheets, list them last. They will take
  # precedence over the more global cheatsheets.
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # While it requires no configuration here, it's also worth noting that
  # cheat will automatically append directories named '.cheat' within the
  # current working directory to the 'cheatpath'. This can be very useful if
  # you'd like to closely associate cheatsheets with, for example, a directory
  # containing source code.
  #
  # Such "directory-scoped" cheatsheets will be treated as the most "local"
  # cheatsheets, and will override less "local" cheatsheets. Similarly,
  # directory-scoped cheatsheets will always be editable ('readonly: false').
`)
}
@@ -1,13 +1,8 @@
package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

// usage returns the usage text for the cheat command
func usage() string {
	return strings.TrimSpace(`Usage:
	return `Usage:
	cheat [options] [<cheatsheet>]

Options:
@@ -65,6 +60,5 @@ Examples:
	cheat --rm foo/bar

	To view the configuration file path:
	cheat --conf
`)
	cheat --conf`
}
@@ -1,82 +0,0 @@
---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# Cheatpaths are paths at which cheatsheets are available on your local
# filesystem.
#
# It is useful to sort cheatsheets into different cheatpaths for organizational
# purposes. For example, you might want one cheatpath for community
# cheatsheets, one for personal cheatsheets, one for cheatsheets pertaining to
# your day job, one for code snippets, etc.
#
# Cheatpaths are scoped, such that more "local" cheatpaths take priority over
# more "global" cheatpaths. (The most global cheatpath is listed first in this
# file; the most local is listed last.) For example, if there is a 'tar'
# cheatsheet on both global and local paths, you'll be presented with the local
# one by default. ('cheat -p' can be used to view cheatsheets from alternative
# cheatpaths.)
#
# Cheatpaths can also be tagged as "read only". This instructs cheat not to
# automatically create cheatsheets on a read-only cheatpath. Instead, when you
# would like to edit a read-only cheatsheet using 'cheat -e', cheat will
# perform a copy-on-write of that cheatsheet from a read-only cheatpath to a
# writeable cheatpath.
#
# This is very useful when you would like to maintain, for example, a
# "pristine" repository of community cheatsheets on one cheatpath, and an
# editable personal repository of cheatsheets on another cheatpath.
#
# Cheatpaths can be also configured to automatically apply tags to cheatsheets
# on certain paths, which can be useful for querying purposes.
# Example: 'cheat -t work jenkins'.
#
# Community cheatsheets must be installed separately, though you may have
# downloaded them automatically when installing 'cheat'. If not, you may
# download them here:
#
# https://github.com/cheat/cheatsheets
cheatpaths:
  # Cheatpath properties mean the following:
  #   'name': the name of the cheatpath (view with 'cheat -d', filter with 'cheat -p')
  #   'path': the filesystem path of the cheatsheet directory (view with 'cheat -d')
  #   'tags': tags that should be automatically applied to sheets on this path
  #   'readonly': shall user-created ('cheat -e') cheatsheets be saved here?
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # If you have personalized cheatsheets, list them last. They will take
  # precedence over the more global cheatsheets.
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # While it requires no configuration here, it's also worth noting that
  # cheat will automatically append directories named '.cheat' within the
  # current working directory to the 'cheatpath'. This can be very useful if
  # you'd like to closely associate cheatsheets with, for example, a directory
  # containing source code.
  #
  # Such "directory-scoped" cheatsheets will be treated as the most "local"
  # cheatsheets, and will override less "local" cheatsheets. Similarly,
  # directory-scoped cheatsheets will always be editable ('readonly: false').
169	doc/adr/001-path-traversal-protection.md	Normal file
@@ -0,0 +1,169 @@
# ADR-001: Path Traversal Protection for Cheatsheet Names

Date: 2025-01-21

## Status

Accepted

## Context

The `cheat` tool allows users to create, edit, and remove cheatsheets using commands like:
- `cheat --edit <name>`
- `cheat --rm <name>`

Without validation, a user could potentially provide malicious names like:
- `../../../etc/passwd` (directory traversal)
- `/etc/passwd` (absolute path)
- `~/.ssh/authorized_keys` (home directory expansion)

While `cheat` is a local tool run by the user themselves (not a network service), path traversal could still lead to:
1. Accidental file overwrites outside cheatsheet directories
2. Confusion about where files are being created
3. Potential security issues in shared environments

## Decision

We implemented input validation for cheatsheet names to prevent directory traversal attacks. The validation rejects names that:

1. Contain `..` (parent directory references)
2. Are absolute paths (start with `/` on Unix)
3. Start with `~` (home directory expansion)
4. Are empty
5. Start with `.` (hidden files - these are not displayed by cheat)

The validation is performed at the application layer before any file operations occur.

## Implementation Details

### Validation Function

The validation is implemented in `internal/cheatpath/validate.go`:

```go
func ValidateSheetName(name string) error {
	// Reject empty names
	if name == "" {
		return fmt.Errorf("cheatsheet name cannot be empty")
	}

	// Reject names containing directory traversal
	if strings.Contains(name, "..") {
		return fmt.Errorf("cheatsheet name cannot contain '..'")
	}

	// Reject absolute paths
	if filepath.IsAbs(name) {
		return fmt.Errorf("cheatsheet name cannot be an absolute path")
	}

	// Reject names that start with ~ (home directory expansion)
	if strings.HasPrefix(name, "~") {
		return fmt.Errorf("cheatsheet name cannot start with '~'")
	}

	// Reject hidden files (files that start with a dot)
	filename := filepath.Base(name)
	if strings.HasPrefix(filename, ".") {
		return fmt.Errorf("cheatsheet name cannot start with '.' (hidden files are not supported)")
	}

	return nil
}
```

### Integration Points

The validation is called in:
- `cmd/cheat/cmd_edit.go` - before creating or editing a cheatsheet
- `cmd/cheat/cmd_remove.go` - before removing a cheatsheet

### Allowed Patterns

The following patterns are explicitly allowed:
- Simple names: `docker`, `git`
- Nested paths: `docker/compose`, `lang/go/slice`
- Current directory references: `./mysheet`
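
As a quick illustration of how those rules combine, here is a minimal, self-contained sketch that reimplements the checks described in this ADR. The `validate` helper below is illustrative only, not the project's actual `ValidateSheetName`:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// validate mirrors the checks described in this ADR; it is a sketch,
// not the project's ValidateSheetName implementation.
func validate(name string) error {
	switch {
	case name == "":
		return fmt.Errorf("name cannot be empty")
	case strings.Contains(name, ".."):
		return fmt.Errorf("name cannot contain '..'")
	case filepath.IsAbs(name):
		return fmt.Errorf("name cannot be an absolute path")
	case strings.HasPrefix(name, "~"):
		return fmt.Errorf("name cannot start with '~'")
	case strings.HasPrefix(filepath.Base(name), "."):
		return fmt.Errorf("name cannot be a hidden file")
	}
	return nil
}

func main() {
	for _, name := range []string{"docker", "docker/compose", "./local", "../../etc/passwd", "~/x"} {
		fmt.Printf("%-20s -> %v\n", name, validate(name))
	}
}
```

Note that `./mysheet` passes the hidden-file check because `filepath.Base("./mysheet")` is `mysheet`, not `.mysheet`.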

## Consequences

### Positive

1. **Safety**: Prevents accidental or intentional file operations outside cheatsheet directories
2. **Simplicity**: Validation happens early, before any file operations
3. **User-friendly**: Clear error messages explain why a name was rejected
4. **Performance**: Minimal overhead - simple string checks
5. **Compatibility**: Doesn't break existing valid cheatsheet names

### Negative

1. **Limitation**: Users cannot use `..` in cheatsheet names even if legitimate
2. **No symlink support**: Cannot create cheatsheets through symlinks outside the cheatpath

### Neutral

1. Uses Go's `filepath.IsAbs()` which handles platform differences (Windows vs Unix)
2. No attempt to resolve or canonicalize paths - validation is purely syntactic

## Security Considerations

### Threat Model

`cheat` is a local command-line tool, not a network service. The primary threats are:
- User error (accidentally overwriting important files)
- Malicious scripts that invoke `cheat` with crafted arguments
- Shared system scenarios where cheatsheets might be shared

### What This Protects Against

- Directory traversal using `../`
- Absolute path access to system files
- Shell expansion of `~` to home directory
- Empty names that might cause unexpected behavior
- Hidden files that wouldn't be displayed anyway

### What This Does NOT Protect Against

- Users with filesystem permissions can still directly edit any file
- Symbolic links within the cheatpath pointing outside
- Race conditions (TOCTOU) - though minimal risk for a local tool
- Malicious content within cheatsheets themselves

## Testing

Comprehensive tests ensure the validation works correctly:

1. **Unit tests** (`internal/cheatpath/validate_test.go`) verify the validation logic
2. **Integration tests** verify the actual binary blocks malicious inputs
3. **No system files are accessed** during testing - all tests use isolated directories

Example test cases:
```bash
# These are blocked:
cheat --edit "../../../etc/passwd"
cheat --edit "/etc/passwd"
cheat --edit "~/.ssh/config"
cheat --rm ".."

# These are allowed:
cheat --edit "docker"
cheat --edit "docker/compose"
cheat --edit "./local"
```

## Alternative Approaches Considered

1. **Path resolution and verification**: Resolve the final path and check if it's within the cheatpath
   - Rejected: More complex, potential race conditions, platform-specific edge cases

2. **Chroot/sandbox**: Run file operations in a restricted environment
   - Rejected: Overkill for a local tool, platform compatibility issues

3. **Filename allowlist**: Only allow alphanumeric characters and specific symbols
   - Rejected: Too restrictive, would break existing cheatsheets with valid special characters

## References

- OWASP Path Traversal: https://owasp.org/www-community/attacks/Path_Traversal
- CWE-22: Improper Limitation of a Pathname to a Restricted Directory
- Go filepath package documentation: https://pkg.go.dev/path/filepath
100  doc/adr/002-environment-variable-parsing.md  Normal file
@@ -0,0 +1,100 @@
# ADR-002: No Defensive Checks for Environment Variable Parsing

Date: 2025-01-21

## Status

Accepted

## Context

In `cmd/cheat/main.go` lines 47-52, the code parses environment variables assuming they all contain an equals sign:

```go
for _, e := range os.Environ() {
	pair := strings.SplitN(e, "=", 2)
	if runtime.GOOS == "windows" {
		pair[0] = strings.ToUpper(pair[0])
	}
	envvars[pair[0]] = pair[1] // Could panic if pair has < 2 elements
}
```

If `os.Environ()` returned a string without an equals sign, `strings.SplitN` would return a slice with only one element, causing a panic when accessing `pair[1]`.

## Decision

We will **not** add defensive checks for this condition. The current code that assumes all environment strings contain "=" will remain unchanged.

## Rationale

### Go Runtime Guarantees

Go's official documentation guarantees that `os.Environ()` returns environment variables in the form "key=value". This is a documented contract of the Go runtime that has been stable since Go 1.0.

### Empirical Evidence

Testing across platforms confirms:
- All environment variables returned by `os.Environ()` contain at least one "="
- Empty environment variables appear as "KEY=" (with an empty value)
- Even Windows special variables like "=C:=C:\path" maintain the format

### Cost-Benefit Analysis

Adding defensive code would:
- **Cost**: Add complexity and cognitive overhead
- **Cost**: Suggest uncertainty about Go's documented behavior
- **Cost**: Create dead code that can never execute under normal conditions
- **Benefit**: Protect against a theoretical scenario that violates Go's guarantees

The only scenarios where this could panic are:
1. A bug in Go's runtime (extremely unlikely, would affect all Go programs)
2. Corrupted OS-level environment (would cause broader system issues)
3. Breaking change in future Go version (would break many programs, unlikely)

## Consequences

### Positive
- Simpler, more readable code
- Trust in platform guarantees reduces unnecessary defensive programming
- No performance overhead from unnecessary checks

### Negative
- Theoretical panic if Go's guarantees are violated

### Neutral
- Follows Go community standards of trusting standard library contracts

## Alternatives Considered

### 1. Add Defensive Check
```go
if len(pair) < 2 {
	continue // or pair[1] = ""
}
```
**Rejected**: Adds complexity for a condition that should never occur.

### 2. Add Panic with Clear Message
```go
if len(pair) < 2 {
	panic("os.Environ() contract violation: " + e)
}
```
**Rejected**: Would crash the program for the same theoretical issue.

### 3. Add Comment Documenting Assumption
```go
// os.Environ() guarantees "key=value" format, so pair[1] is safe
envvars[pair[0]] = pair[1]
```
**Rejected**: While documentation is good, this particular guarantee is fundamental to Go.

## Notes

If Go ever changes this behavior (extremely unlikely as it would break compatibility), it would be caught immediately in testing as the program would panic on startup. This would be a clear signal to revisit this decision.

## References

- Go os.Environ() documentation: https://pkg.go.dev/os#Environ
- Go os.Environ() source code and tests
104  doc/adr/003-search-parallelization.md  Normal file
@@ -0,0 +1,104 @@
# ADR-003: No Parallelization for Search Operations

Date: 2025-01-22

## Status

Accepted

## Context

We investigated optimizing cheat's search performance through parallelization. Initial assumptions suggested that I/O operations (reading multiple cheatsheet files) would be the primary bottleneck, making parallel processing beneficial.

Performance benchmarks were implemented to measure search operations, and a parallel search implementation using goroutines was created and tested.

## Decision

We will **not** implement parallel search. The sequential implementation will remain unchanged.

## Rationale

### Performance Profile Analysis

CPU profiling revealed that search performance is dominated by:
- **Process creation overhead** (~30% in `os/exec.(*Cmd).Run`)
- **System calls** (~30% in `syscall.Syscall6`)
- **Process management** (fork, exec, pipe setup)

The actual search logic (regex matching, file reading) was negligible in the profile, indicating our optimization efforts were targeting the wrong bottleneck.

### Benchmark Results

Parallel implementation showed minimal improvements:
- Simple search: 17ms → 15.3ms (10% improvement)
- Regex search: 15ms → 14.9ms (minimal improvement)
- Colorized search: 19.5ms → 16.8ms (14% improvement)
- Complex regex: 20ms → 15.3ms (24% improvement)

The best case saved only ~5ms in absolute terms.

### Cost-Benefit Analysis

**Costs of parallelization:**
- Added complexity with goroutines, channels, and synchronization
- Increased maintenance burden
- More difficult debugging and testing
- Potential race conditions

**Benefits:**
- 5-15% performance improvement (5ms in real terms)
- Imperceptible to users in interactive use

### User Experience Perspective

For a command-line tool:
- Current 15-20ms response time is excellent
- Users cannot perceive 5ms differences
- Sub-50ms is considered "instant" in HCI research

## Consequences

### Positive
- Simpler, more maintainable codebase
- Easier to debug and reason about
- No synchronization bugs or race conditions
- Focus remains on code clarity

### Negative
- Missed opportunity for ~5ms performance gain
- Search remains single-threaded

### Neutral
- Performance remains excellent for intended use case
- Follows Go philosophy of preferring simplicity

## Alternatives Considered

### 1. Keep Parallel Implementation
**Rejected**: Complexity outweighs negligible performance gains.

### 2. Optimize Process Startup
**Rejected**: Process creation overhead is inherent to CLI tools and cannot be avoided without fundamental architecture changes.

### 3. Future Optimizations
If performance becomes critical, consider:
- **Long-running daemon**: Eliminate process startup overhead entirely
- **Shell function**: Reduce fork/exec overhead
- **Compiled-in cheatsheets**: Eliminate file I/O

However, these would fundamentally change the tool's architecture and usage model.

## Notes

This decision reinforces important principles:
1. Always profile before optimizing
2. Consider the full execution context
3. Measure what matters to users
4. Complexity has a real cost

The parallelization attempt was valuable as a learning exercise and definitively answered whether this optimization path was worthwhile.

## References

- Benchmark implementation: cmd/cheat/search_bench_test.go
- Reverted parallel implementation: see git history (commit 82eb918)
38  doc/adr/README.md  Normal file
@@ -0,0 +1,38 @@
# Architecture Decision Records

This directory contains Architecture Decision Records (ADRs) for the cheat project.

## What is an ADR?

An Architecture Decision Record captures an important architectural decision made along with its context and consequences. ADRs help us:

- Document why decisions were made
- Understand the context and trade-offs
- Review decisions when requirements change
- Onboard new contributors

## ADR Format

Each ADR follows this template:

1. **Title**: ADR-NNN: Brief description
2. **Date**: When the decision was made
3. **Status**: Proposed, Accepted, Deprecated, Superseded
4. **Context**: What prompted this decision?
5. **Decision**: What did we decide to do?
6. **Consequences**: What are the positive, negative, and neutral outcomes?

## Index of ADRs

| ADR | Title | Status | Date |
|-----|-------|--------|------|
| [001](001-path-traversal-protection.md) | Path Traversal Protection for Cheatsheet Names | Accepted | 2025-01-21 |
| [002](002-environment-variable-parsing.md) | No Defensive Checks for Environment Variable Parsing | Accepted | 2025-01-21 |
| [003](003-search-parallelization.md) | No Parallelization for Search Operations | Accepted | 2025-01-22 |

## Creating a New ADR

1. Copy the template from an existing ADR
2. Use the next sequential number
3. Fill in all sections
4. Include the ADR alongside the commit implementing the decision
98  doc/cheat.1
@@ -1,31 +1,14 @@
.\" Automatically generated by Pandoc 2.17.1.1
.\" Automatically generated by Pandoc 3.1.11.1
.\"
.\" Define V font for inline verbatim, using C font in formats
.\" that render this, and otherwise B font.
.ie "\f[CB]x\f[]"x" \{\
. ftr V B
. ftr VI BI
. ftr VB B
. ftr VBI BI
.\}
.el \{\
. ftr V CR
. ftr VI CI
. ftr VB CB
. ftr VBI CBI
.\}
.TH "CHEAT" "1" "" "" "General Commands Manual"
.hy
.SH NAME
.PP
\f[B]cheat\f[R] \[em] create and view command-line cheatsheets
\f[B]cheat\f[R] \[em] create and view command\-line cheatsheets
.SH SYNOPSIS
.PP
\f[B]cheat\f[R] [options] [\f[I]CHEATSHEET\f[R]]
.SH DESCRIPTION
.PP
\f[B]cheat\f[R] allows you to create and view interactive cheatsheets on
the command-line.
the command\-line.
It was designed to help remind *nix system administrators of options for
commands that they use frequently, but not frequently enough to
remember.
@@ -34,34 +17,40 @@ remember.
\[en]init
Print a config file to stdout.
.TP
-c, \[en]colorize
\[en]conf
Display the config file path.
.TP
\-a, \[en]all
Search among all cheatpaths.
.TP
\-c, \[en]colorize
Colorize output.
.TP
-d, \[en]directories
\-d, \[en]directories
List cheatsheet directories.
.TP
-e, \[en]edit=\f[I]CHEATSHEET\f[R]
\-e, \[en]edit=\f[I]CHEATSHEET\f[R]
Open \f[I]CHEATSHEET\f[R] for editing.
.TP
-l, \[en]list
\-l, \[en]list
List available cheatsheets.
.TP
-p, \[en]path=\f[I]PATH\f[R]
\-p, \[en]path=\f[I]PATH\f[R]
Filter only to sheets found on path \f[I]PATH\f[R].
.TP
-r, \[en]regex
\-r, \[en]regex
Treat search \f[I]PHRASE\f[R] as a regular expression.
.TP
-s, \[en]search=\f[I]PHRASE\f[R]
\-s, \[en]search=\f[I]PHRASE\f[R]
Search cheatsheets for \f[I]PHRASE\f[R].
.TP
-t, \[en]tag=\f[I]TAG\f[R]
\-t, \[en]tag=\f[I]TAG\f[R]
Filter only to sheets tagged with \f[I]TAG\f[R].
.TP
-T, \[en]tags
\-T, \[en]tags
List all tags in use.
.TP
-v, \[en]version
\-v, \[en]version
Print the version number.
.TP
\[en]rm=\f[I]CHEATSHEET\f[R]
@@ -72,37 +61,39 @@ To view the foo cheatsheet:
cheat \f[I]foo\f[R]
.TP
To edit (or create) the foo cheatsheet:
cheat -e \f[I]foo\f[R]
cheat \-e \f[I]foo\f[R]
.TP
To edit (or create) the foo/bar cheatsheet on the `work' cheatpath:
cheat -p \f[I]work\f[R] -e \f[I]foo/bar\f[R]
cheat \-p \f[I]work\f[R] \-e \f[I]foo/bar\f[R]
.TP
To view all cheatsheet directories:
cheat -d
cheat \-d
.TP
To list all available cheatsheets:
cheat -l
cheat \-l
.TP
To list all cheatsheets whose titles match `apt':
cheat -l \f[I]apt\f[R]
cheat \-l \f[I]apt\f[R]
.TP
To list all tags in use:
cheat -T
cheat \-T
.TP
To list available cheatsheets that are tagged as `personal':
cheat -l -t \f[I]personal\f[R]
cheat \-l \-t \f[I]personal\f[R]
.TP
To search for `ssh' among all cheatsheets, and colorize matches:
cheat -c -s \f[I]ssh\f[R]
cheat \-c \-s \f[I]ssh\f[R]
.TP
To search (by regex) for cheatsheets that contain an IP address:
cheat -c -r -s \f[I]`(?:[0-9]{1,3}.){3}[0-9]{1,3}'\f[R]
cheat \-c \-r \-s \f[I]`(?:[0\-9]{1,3}.){3}[0\-9]{1,3}'\f[R]
.TP
To remove (delete) the foo/bar cheatsheet:
cheat \[en]rm \f[I]foo/bar\f[R]
.TP
To view the configuration file path:
cheat \[en]conf
.SH FILES
.SS Configuration
.PP
\f[B]cheat\f[R] is configured via a YAML file that is conventionally
named \f[I]conf.yaml\f[R].
\f[B]cheat\f[R] will search for \f[I]conf.yaml\f[R] in varying
@@ -133,24 +124,28 @@ Alternatively, you may also generate a config file manually by running
\f[B]cheat \[en]init\f[R] and saving its output to the appropriate
location for your platform.
.SS Cheatpaths
.PP
\f[B]cheat\f[R] reads its cheatsheets from \[lq]cheatpaths\[rq], which
are the directories in which cheatsheets are stored.
Cheatpaths may be configured in \f[I]conf.yaml\f[R], and viewed via
\f[B]cheat -d\f[R].
\f[B]cheat \-d\f[R].
.PP
For detailed instructions on how to configure cheatpaths, please refer
to the comments in conf.yml.
.SS Autocompletion
.PP
Autocompletion scripts for \f[B]bash\f[R], \f[B]zsh\f[R], and
\f[B]fish\f[R] are available for download:
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.bash>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.bash
.UE \c
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.fish>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.fish
.UE \c
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh
.UE \c
.PP
The \f[B]bash\f[R] and \f[B]zsh\f[R] scripts provide optional
integration with \f[B]fzf\f[R], if the latter is available on your
@@ -176,11 +171,12 @@ Application error
.IP "2." 3
Cheatsheet(s) not found
.SH BUGS
.PP
See GitHub issues: <https://github.com/cheat/cheat/issues>
See GitHub issues: \c
.UR https://github.com/cheat/cheat/issues
.UE \c
.SH AUTHOR
.PP
Christopher Allen Lane <chris@chris-allen-lane.com>
Christopher Allen Lane \c
.MT chris@chris-allen-lane.com
.ME \c
.SH SEE ALSO
.PP
\f[B]fzf(1)\f[R]

@@ -23,6 +23,12 @@ OPTIONS
--init
: Print a config file to stdout.

--conf
: Display the config file path.

-a, --all
: Search among all cheatpaths.

-c, --colorize
: Colorize output.

@@ -93,6 +99,9 @@ To search (by regex) for cheatsheets that contain an IP address:
To remove (delete) the foo/bar cheatsheet:
: cheat --rm _foo/bar_

To view the configuration file path:
: cheat --conf


FILES
=====

8  go.mod
@@ -3,20 +3,20 @@ module github.com/cheat/cheat
go 1.19

require (
	github.com/alecthomas/chroma v0.10.0
	github.com/alecthomas/chroma/v2 v2.12.0
	github.com/davecgh/go-spew v1.1.1
	github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815
	github.com/go-git/go-git/v5 v5.11.0
	github.com/mattn/go-isatty v0.0.20
	github.com/mitchellh/go-homedir v1.1.0
	gopkg.in/yaml.v2 v2.4.0
	gopkg.in/yaml.v3 v3.0.1
)

require (
	dario.cat/mergo v1.0.0 // indirect
	github.com/Microsoft/go-winio v0.6.1 // indirect
	github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c // indirect
	github.com/cloudflare/circl v1.3.6 // indirect
	github.com/cloudflare/circl v1.3.7 // indirect
	github.com/cyphar/filepath-securejoin v0.2.4 // indirect
	github.com/dlclark/regexp2 v1.10.0 // indirect
	github.com/emirpasic/gods v1.18.1 // indirect
@@ -29,7 +29,7 @@ require (
	github.com/sergi/go-diff v1.3.1 // indirect
	github.com/skeema/knownhosts v1.2.1 // indirect
	github.com/xanzy/ssh-agent v0.3.3 // indirect
	golang.org/x/crypto v0.16.0 // indirect
	golang.org/x/crypto v0.17.0 // indirect
	golang.org/x/mod v0.14.0 // indirect
	golang.org/x/net v0.19.0 // indirect
	golang.org/x/sys v0.15.0 // indirect

19  go.sum
@@ -5,20 +5,21 @@ github.com/Microsoft/go-winio v0.6.1 h1:9/kr64B9VUZrLm5YYwbGtUJnMgqWVOdUAXu6Migc
github.com/Microsoft/go-winio v0.6.1/go.mod h1:LRdKpFKfdobln8UmuiYcKPot9D2v6svN5+sAH+4kjUM=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c h1:kMFnB0vCcX7IL/m9Y5LO+KQYv+t1CQOiFe6+SV2J7bE=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c/go.mod h1:EjAoLdwvbIOoOQr3ihjnSoLZRtE8azugULFRteWMNc0=
github.com/alecthomas/chroma v0.10.0 h1:7XDcGkCQopCNKjZHfYrNLraA+M7e0fMiJ/Mfikbfjek=
github.com/alecthomas/chroma v0.10.0/go.mod h1:jtJATyUxlIORhUOFNA9NZDWGAQ8wpxQQqNSB4rjA/1s=
github.com/alecthomas/assert/v2 v2.2.1 h1:XivOgYcduV98QCahG8T5XTezV5bylXe+lBxLG2K2ink=
github.com/alecthomas/chroma/v2 v2.12.0 h1:Wh8qLEgMMsN7mgyG8/qIpegky2Hvzr4By6gEF7cmWgw=
github.com/alecthomas/chroma/v2 v2.12.0/go.mod h1:4TQu7gdfuPjSh76j78ietmqh9LiurGF0EpseFXdKMBw=
github.com/alecthomas/repr v0.2.0 h1:HAzS41CIzNW5syS8Mf9UwXhNH1J9aix/BvDRf1Ml2Yk=
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
github.com/bwesterb/go-ristretto v1.2.3/go.mod h1:fUIoIZaG73pV5biE2Blr2xEzDoMj7NFEuV9ekS419A0=
github.com/cloudflare/circl v1.3.3/go.mod h1:5XYMA4rFBvNIrhs50XuiBJ15vF2pZn4nnUKZrLbUZFA=
github.com/cloudflare/circl v1.3.6 h1:/xbKIqSHbZXHwkhbrhrt2YOHIwYJlXH94E3tI/gDlUg=
github.com/cloudflare/circl v1.3.6/go.mod h1:5XYMA4rFBvNIrhs50XuiBJ15vF2pZn4nnUKZrLbUZFA=
github.com/cloudflare/circl v1.3.7 h1:qlCDlTPz2n9fu58M0Nh1J/JzcFpfgkFHHX3O35r5vcU=
github.com/cloudflare/circl v1.3.7/go.mod h1:sRTcRWXGLrKw6yIGJ+l7amYJFfAXbZG0kBSc8r4zxgA=
github.com/cyphar/filepath-securejoin v0.2.4 h1:Ugdm7cg7i6ZK6x3xDF1oEu1nfkyfH53EtKeQYTC3kyg=
github.com/cyphar/filepath-securejoin v0.2.4/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxGGx79pTxQpKOJNYHHl4=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.4.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/dlclark/regexp2 v1.10.0 h1:+/GIL799phkJqYW+3YbOd8LCcbHzT0Pbo8zl70MHsq0=
github.com/dlclark/regexp2 v1.10.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 h1:bWDMxwH3px2JBh6AyO7hdCn/PkvCZXii8TGj7sbtEbQ=
@@ -37,6 +38,7 @@ github.com/go-git/go-git/v5 v5.11.0/go.mod h1:6GFcX2P3NM7FPBfpePbpLd21XxsgdAt+lK
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE=
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/hexops/gotextdiff v1.0.3 h1:gitA9+qJrrTCsiCl7+kh75nPqQt1cx4ZkudSTLoUqJM=
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=
github.com/kevinburke/ssh_config v1.2.0 h1:x584FjTGwHzMwvHx18PXxbBVzfnxogHaAReU4gf13a4=
@@ -66,7 +68,6 @@ github.com/skeema/knownhosts v1.2.1/go.mod h1:xYbVRSPxqBZFrdmDyMmsOs+uX1UZC3nTN3
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.4 h1:CcVxjf3Q8PM0mHUKJCdn+eZZtm5yQwehR5yeSVQQcUk=
github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
@@ -76,8 +77,8 @@ golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5y
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
golang.org/x/crypto v0.3.1-0.20221117191849-2c476679df9a/go.mod h1:hebNnKkNXi2UzZN1eVRvBB7co0a+JxK6XbPiWVs/3J4=
golang.org/x/crypto v0.7.0/go.mod h1:pYwdfH91IfpZVANVyUOhSIPZaFoJGxTFbZhFTx+dXZU=
golang.org/x/crypto v0.16.0 h1:mMMrFzRSCF0GvB7Ne27XVtVAaXLrPmgPC7/v0tkwHaY=
golang.org/x/crypto v0.16.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=
golang.org/x/crypto v0.17.0 h1:r8bRNjWL3GshPW3gkd+RpvzWrZAwPS49OmTGZ/uhM4k=
golang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/mod v0.14.0 h1:dGoOF9QVLYng8IHTm7BAyWqCqSheQ5pYWGhzW00YJr0=
@@ -137,8 +138,6 @@ gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntN
gopkg.in/warnings.v0 v0.1.2 h1:wFXVbFY8DY5/xOe1ECiWdKCzZlxgshcYVNkBHstARME=
gopkg.in/warnings.v0 v0.1.2/go.mod h1:jksf8JmL6Qr/oQM2OXTHunEvvTAsrWBLb6OOjuVWRNI=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

@@ -2,6 +2,8 @@
// management.
package cheatpath

import "fmt"

// Cheatpath encapsulates cheatsheet path information
type Cheatpath struct {
	Name string `yaml:"name"`
@@ -9,3 +11,18 @@ type Cheatpath struct {
	ReadOnly bool `yaml:"readonly"`
	Tags []string `yaml:"tags"`
}

// Validate ensures that the Cheatpath is valid
func (c Cheatpath) Validate() error {
	// Check that name is not empty
	if c.Name == "" {
		return fmt.Errorf("cheatpath name cannot be empty")
	}

	// Check that path is not empty
	if c.Path == "" {
		return fmt.Errorf("cheatpath path cannot be empty")
	}

	return nil
}

113  internal/cheatpath/cheatpath_test.go  Normal file
@@ -0,0 +1,113 @@
package cheatpath

import (
	"strings"
	"testing"
)

func TestCheatpathValidate(t *testing.T) {
	tests := []struct {
		name      string
		cheatpath Cheatpath
		wantErr   bool
		errMsg    string
	}{
		{
			name: "valid cheatpath",
			cheatpath: Cheatpath{
				Name:     "personal",
				Path:     "/home/user/.config/cheat/personal",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: false,
		},
		{
			name: "empty name",
			cheatpath: Cheatpath{
				Name:     "",
				Path:     "/home/user/.config/cheat/personal",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: true,
			errMsg:  "cheatpath name cannot be empty",
		},
		{
			name: "empty path",
			cheatpath: Cheatpath{
				Name:     "personal",
				Path:     "",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: true,
			errMsg:  "cheatpath path cannot be empty",
		},
		{
			name: "both empty",
			cheatpath: Cheatpath{
				Name:     "",
				Path:     "",
				ReadOnly: true,
				Tags:     nil,
			},
			wantErr: true,
			errMsg:  "cheatpath name cannot be empty",
		},
		{
			name: "minimal valid",
			cheatpath: Cheatpath{
				Name: "x",
				Path: "/",
			},
			wantErr: false,
		},
		{
			name: "with readonly and tags",
			cheatpath: Cheatpath{
				Name:     "community",
				Path:     "/usr/share/cheat",
				ReadOnly: true,
				Tags:     []string{"community", "shared", "readonly"},
			},
			wantErr: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			err := tt.cheatpath.Validate()
			if (err != nil) != tt.wantErr {
				t.Errorf("Validate() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if err != nil && tt.errMsg != "" && !strings.Contains(err.Error(), tt.errMsg) {
				t.Errorf("Validate() error = %v, want error containing %q", err, tt.errMsg)
			}
		})
	}
}

func TestCheatpathStruct(t *testing.T) {
	// Test that the struct fields work as expected
	cp := Cheatpath{
		Name:     "test",
		Path:     "/test/path",
		ReadOnly: true,
		Tags:     []string{"tag1", "tag2"},
	}

	if cp.Name != "test" {
		t.Errorf("expected Name to be 'test', got %q", cp.Name)
	}
	if cp.Path != "/test/path" {
		t.Errorf("expected Path to be '/test/path', got %q", cp.Path)
	}
	if !cp.ReadOnly {
		t.Error("expected ReadOnly to be true")
	}
	if len(cp.Tags) != 2 || cp.Tags[0] != "tag1" || cp.Tags[1] != "tag2" {
		t.Errorf("expected Tags to be [tag1 tag2], got %v", cp.Tags)
	}
}
63	internal/cheatpath/doc.go	Normal file
@@ -0,0 +1,63 @@
// Package cheatpath manages collections of cheat sheets organized in filesystem directories.
//
// A Cheatpath represents a directory containing cheat sheets, with associated
// metadata such as tags and read-only status. Multiple cheatpaths can be
// configured to organize sheets from different sources (personal, community, work, etc.).
//
// # Cheatpath Structure
//
// Each cheatpath has:
//   - Name: A friendly identifier (e.g., "personal", "community")
//   - Path: The filesystem path to the directory
//   - Tags: Tags automatically applied to all sheets in this path
//   - ReadOnly: Whether sheets in this path can be modified
//
// Example configuration:
//
//	cheatpaths:
//	  - name: personal
//	    path: ~/cheat
//	    tags: []
//	    readonly: false
//	  - name: community
//	    path: ~/cheat/community
//	    tags: [community]
//	    readonly: true
//
// # Directory-Scoped Cheatpaths
//
// The package supports directory-scoped cheatpaths via `.cheat` directories.
// When running cheat from a directory containing a `.cheat` subdirectory,
// that directory is temporarily added to the available cheatpaths.
//
// # Precedence and Overrides
//
// When multiple cheatpaths contain a sheet with the same name, the sheet
// from the most "local" cheatpath takes precedence. This allows users to
// override community sheets with personal versions.
//
// # Key Functions
//
//   - Filter: Filters cheatpaths by name
//   - Validate: Ensures cheatpath configuration is valid
//   - Writeable: Returns the first writeable cheatpath
//
// # Example Usage
//
//	// Filter cheatpaths to only "personal"
//	filtered, err := cheatpath.Filter(paths, "personal")
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Find a writeable cheatpath
//	writeable, err := cheatpath.Writeable(paths)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Validate cheatpath configuration
//	if err := cheatpath.Validate(paths); err != nil {
//		log.Fatal(err)
//	}
package cheatpath
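The Writeable helper described in the doc comment above can be pictured with a short standalone sketch. The types and the lowercase `writeable` function below are simplified illustrations, not the package's actual implementation:

```go
package main

import (
	"errors"
	"fmt"
)

// Cheatpath mirrors the fields listed in the doc comment.
type Cheatpath struct {
	Name     string
	Path     string
	ReadOnly bool
	Tags     []string
}

// writeable returns the first cheatpath that is not read-only,
// which is the behavior the doc comment ascribes to Writeable.
func writeable(paths []Cheatpath) (Cheatpath, error) {
	for _, p := range paths {
		if !p.ReadOnly {
			return p, nil
		}
	}
	return Cheatpath{}, errors.New("no writeable cheatpaths found")
}

func main() {
	paths := []Cheatpath{
		{Name: "community", Path: "/usr/share/cheat", ReadOnly: true},
		{Name: "personal", Path: "/home/user/cheat", ReadOnly: false},
	}
	if w, err := writeable(paths); err == nil {
		fmt.Println(w.Name) // prints "personal"
	}
}
```

Ordering the slice from most global to most local keeps this "first writeable wins" scan simple, since personal paths are typically the writeable ones.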
@@ -2,16 +2,38 @@ package cheatpath

 import (
 	"fmt"
+	"path/filepath"
+	"strings"
 )

-// Validate returns an error if the cheatpath is invalid
-func (c *Cheatpath) Validate() error {
-
-	if c.Name == "" {
-		return fmt.Errorf("invalid cheatpath: name must be specified")
+// ValidateSheetName ensures that a cheatsheet name does not contain
+// directory traversal sequences or other potentially dangerous patterns.
+func ValidateSheetName(name string) error {
+	// Reject empty names
+	if name == "" {
+		return fmt.Errorf("cheatsheet name cannot be empty")
 	}
-	if c.Path == "" {
-		return fmt.Errorf("invalid cheatpath: path must be specified")
+
+	// Reject names containing directory traversal
+	if strings.Contains(name, "..") {
+		return fmt.Errorf("cheatsheet name cannot contain '..'")
 	}
+
+	// Reject absolute paths
+	if filepath.IsAbs(name) {
+		return fmt.Errorf("cheatsheet name cannot be an absolute path")
+	}
+
+	// Reject names that start with ~ (home directory expansion)
+	if strings.HasPrefix(name, "~") {
+		return fmt.Errorf("cheatsheet name cannot start with '~'")
+	}
+
+	// Reject hidden files (files that start with a dot)
+	// We don't display hidden files, so we shouldn't create them
+	filename := filepath.Base(name)
+	if strings.HasPrefix(filename, ".") {
+		return fmt.Errorf("cheatsheet name cannot start with '.' (hidden files are not supported)")
+	}
+
+	return nil
 }
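Taken together, the checks above can be exercised with a tiny standalone program. The lowercase `validateSheetName` below re-implements the same checks for illustration only; it is not the package's exported function:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// validateSheetName mirrors the checks in the diff above: reject empty
// names, "..", absolute paths, "~" prefixes, and hidden base filenames.
func validateSheetName(name string) error {
	if name == "" {
		return fmt.Errorf("name cannot be empty")
	}
	if strings.Contains(name, "..") {
		return fmt.Errorf("name cannot contain '..'")
	}
	if filepath.IsAbs(name) {
		return fmt.Errorf("name cannot be an absolute path")
	}
	if strings.HasPrefix(name, "~") {
		return fmt.Errorf("name cannot start with '~'")
	}
	if strings.HasPrefix(filepath.Base(name), ".") {
		return fmt.Errorf("name cannot start with '.'")
	}
	return nil
}

func main() {
	for _, name := range []string{"docker/compose", "../etc/passwd", "~/secrets", ".hidden"} {
		if err := validateSheetName(name); err != nil {
			fmt.Printf("%-16s rejected: %v\n", name, err)
		} else {
			fmt.Printf("%-16s ok\n", name)
		}
	}
}
```

Note that rejecting the literal substring ".." is deliberately stricter than cleaning the path: it also catches encoded or partially-normalized traversal attempts before they reach the filesystem.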
169	internal/cheatpath/validate_fuzz_test.go	Normal file
@@ -0,0 +1,169 @@
package cheatpath

import (
	"strings"
	"testing"
	"unicode/utf8"
)

// FuzzValidateSheetName tests the ValidateSheetName function with fuzzing
// to ensure it properly prevents path traversal and other security issues
func FuzzValidateSheetName(f *testing.F) {
	// Add seed corpus with various valid and malicious inputs
	// Valid names
	f.Add("docker")
	f.Add("docker/compose")
	f.Add("lang/go/slice")
	f.Add("my-cheat_sheet")
	f.Add("file.txt")
	f.Add("a")
	f.Add("123")

	// Path traversal attempts
	f.Add("..")
	f.Add("../etc/passwd")
	f.Add("foo/../bar")
	f.Add("foo/../../etc/passwd")
	f.Add("..\\windows\\system32")
	f.Add("foo\\..\\..\\windows")

	// Encoded traversal attempts
	f.Add("%2e%2e")
	f.Add("%2e%2e%2f")
	f.Add("..%2f")
	f.Add("%2e.")
	f.Add(".%2e")
	f.Add("\x2e\x2e")
	f.Add("\\x2e\\x2e")

	// Unicode and special characters
	f.Add("€test")
	f.Add("test€")
	f.Add("中文")
	f.Add("🎉emoji")
	f.Add("\x00null")
	f.Add("test\x00null")
	f.Add("\nnewline")
	f.Add("test\ttab")

	// Absolute paths
	f.Add("/etc/passwd")
	f.Add("C:\\Windows\\System32")
	f.Add("\\\\server\\share")
	f.Add("//server/share")

	// Home directory
	f.Add("~")
	f.Add("~/config")
	f.Add("~user/file")

	// Hidden files
	f.Add(".hidden")
	f.Add("dir/.hidden")
	f.Add(".git/config")

	// Edge cases
	f.Add("")
	f.Add(" ")
	f.Add("  ")
	f.Add("\t")
	f.Add(".")
	f.Add("./")
	f.Add("./file")
	f.Add(".../")
	f.Add("...")
	f.Add("....")

	// Very long names
	f.Add(strings.Repeat("a", 255))
	f.Add(strings.Repeat("a/", 100) + "file")
	f.Add(strings.Repeat("../", 50) + "etc/passwd")

	f.Fuzz(func(t *testing.T, input string) {
		// The function should never panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("ValidateSheetName panicked with input %q: %v", input, r)
				}
			}()

			err := ValidateSheetName(input)

			// Security invariants that must always hold
			if err == nil {
				// If validation passed, verify security properties

				// Should not contain ".." for path traversal
				if strings.Contains(input, "..") {
					t.Errorf("validation passed but input contains '..': %q", input)
				}

				// Should not be empty
				if input == "" {
					t.Error("validation passed for empty input")
				}

				// Should not start with ~ (home directory)
				if strings.HasPrefix(input, "~") {
					t.Errorf("validation passed but input starts with '~': %q", input)
				}

				// Base filename should not start with .
				parts := strings.Split(input, "/")
				if len(parts) > 0 {
					lastPart := parts[len(parts)-1]
					if strings.HasPrefix(lastPart, ".") && lastPart != "." {
						t.Errorf("validation passed but filename starts with '.': %q", input)
					}
				}

				// Additional check: result should be valid UTF-8
				if !utf8.ValidString(input) {
					// While the function doesn't explicitly check this,
					// we want to ensure it handles invalid UTF-8 gracefully
					t.Logf("validation passed for invalid UTF-8: %q", input)
				}
			}
		}()
	})
}

// FuzzValidateSheetNamePathTraversal specifically targets path traversal bypasses
func FuzzValidateSheetNamePathTraversal(f *testing.F) {
	// Seed corpus focusing on path traversal variations
	f.Add("..", "/", "")
	f.Add("", "..", "/")
	f.Add("a", "b", "c")

	f.Fuzz(func(t *testing.T, prefix string, middle string, suffix string) {
		// Construct various path traversal attempts
		inputs := []string{
			prefix + ".." + suffix,
			prefix + "/.." + suffix,
			prefix + "\\.." + suffix,
			prefix + middle + ".." + suffix,
			prefix + "../" + middle + suffix,
			prefix + "..%2f" + suffix,
			prefix + "%2e%2e" + suffix,
			prefix + "%2e%2e%2f" + suffix,
		}

		for _, input := range inputs {
			func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("ValidateSheetName panicked with constructed input %q: %v", input, r)
					}
				}()

				err := ValidateSheetName(input)

				// If the input contains literal "..", it must be rejected
				if strings.Contains(input, "..") && err == nil {
					t.Errorf("validation incorrectly passed for input containing '..': %q", input)
				}
			}()
		}
	})
}
@@ -1,56 +1,106 @@
 package cheatpath

 import (
+	"strings"
 	"testing"
 )

-// TestValidateValid asserts that valid cheatpaths validate successfully
-func TestValidateValid(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Name:     "foo",
-		Path:     "/foo",
-		ReadOnly: false,
-		Tags:     []string{},
+func TestValidateSheetName(t *testing.T) {
+	tests := []struct {
+		name    string
+		input   string
+		wantErr bool
+		errMsg  string
+	}{
+		// Valid names
+		{
+			name:    "simple name",
+			input:   "docker",
+			wantErr: false,
+		},
+		{
+			name:    "name with slash",
+			input:   "docker/compose",
+			wantErr: false,
+		},
+		{
+			name:    "name with multiple slashes",
+			input:   "lang/go/slice",
+			wantErr: false,
+		},
+		{
+			name:    "name with dash and underscore",
+			input:   "my-cheat_sheet",
+			wantErr: false,
+		},
+		// Invalid names
+		{
+			name:    "empty name",
+			input:   "",
+			wantErr: true,
+			errMsg:  "empty",
+		},
+		{
+			name:    "parent directory traversal",
+			input:   "../etc/passwd",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "complex traversal",
+			input:   "foo/../../etc/passwd",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "absolute path",
+			input:   "/etc/passwd",
+			wantErr: true,
+			errMsg:  "absolute",
+		},
+		{
+			name:    "home directory",
+			input:   "~/secrets",
+			wantErr: true,
+			errMsg:  "'~'",
+		},
+		{
+			name:    "just dots",
+			input:   "..",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "hidden file not allowed",
+			input:   ".hidden",
+			wantErr: true,
+			errMsg:  "cannot start with '.'",
+		},
+		{
+			name:    "current dir is ok",
+			input:   "./current",
+			wantErr: false,
+		},
+		{
+			name:    "nested hidden file not allowed",
+			input:   "config/.gitignore",
+			wantErr: true,
+			errMsg:  "cannot start with '.'",
+		},
 	}

-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err != nil {
-		t.Errorf("failed to validate valid cheatpath: %v", err)
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			err := ValidateSheetName(tt.input)
+			if (err != nil) != tt.wantErr {
+				t.Errorf("ValidateName(%q) error = %v, wantErr %v", tt.input, err, tt.wantErr)
+				return
+			}
+			if err != nil && tt.errMsg != "" {
+				if !strings.Contains(err.Error(), tt.errMsg) {
+					t.Errorf("ValidateName(%q) error = %v, want error containing %q", tt.input, err, tt.errMsg)
+				}
+			}
-	}
-}
-
-// TestValidateMissingName asserts that paths that are missing a name fail to
-// validate
-func TestValidateMissingName(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Path:     "/foo",
-		ReadOnly: false,
-		Tags:     []string{},
-	}
-
-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err == nil {
-		t.Errorf("failed to invalidate cheatpath without name")
-	}
-}
-
-// TestValidateMissingPath asserts that paths that are missing a path fail to
-// validate
-func TestValidateMissingPath(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Name:     "foo",
-		ReadOnly: false,
-		Tags:     []string{},
-	}
-
-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err == nil {
-		t.Errorf("failed to invalidate cheatpath without path")
+		})
 	}
 }
@@ -10,7 +10,7 @@ import (
 	cp "github.com/cheat/cheat/internal/cheatpath"

 	"github.com/mitchellh/go-homedir"
-	"gopkg.in/yaml.v2"
+	"gopkg.in/yaml.v3"
 )

 // Config encapsulates configuration parameters
@@ -40,7 +40,7 @@ func New(_ map[string]interface{}, confPath string, resolve bool) (Config, error
 	conf.Path = confPath

 	// unmarshal the yaml
-	err = yaml.UnmarshalStrict(buf, &conf)
+	err = yaml.Unmarshal(buf, &conf)
 	if err != nil {
 		return Config{}, fmt.Errorf("could not unmarshal yaml: %v", err)
 	}
@@ -96,6 +96,9 @@ func New(_ map[string]interface{}, confPath string, resolve bool) (Config, error
 		conf.Cheatpaths[i].Path = expanded
 	}

+	// trim editor whitespace
+	conf.Editor = strings.TrimSpace(conf.Editor)
+
 	// if an editor was not provided in the configs, attempt to choose one
 	// that's appropriate for the environment
 	if conf.Editor == "" {
247	internal/config/config_extended_test.go	Normal file
@@ -0,0 +1,247 @@
package config

import (
	"os"
	"path/filepath"
	"testing"

	"github.com/cheat/cheat/internal/mock"
)

// TestConfigYAMLErrors tests YAML parsing errors
func TestConfigYAMLErrors(t *testing.T) {
	// Create a temporary file with invalid YAML
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	invalidYAML := filepath.Join(tempDir, "invalid.yml")
	err = os.WriteFile(invalidYAML, []byte("invalid: yaml: content:\n - no closing"), 0644)
	if err != nil {
		t.Fatalf("failed to write invalid yaml: %v", err)
	}

	// Attempt to load invalid YAML
	_, err = New(map[string]interface{}{}, invalidYAML, false)
	if err == nil {
		t.Error("expected error for invalid YAML, got nil")
	}
}

// TestConfigLocalCheatpath tests local .cheat directory detection
func TestConfigLocalCheatpath(t *testing.T) {
	// Create a temporary directory to act as working directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Save current working directory
	oldCwd, err := os.Getwd()
	if err != nil {
		t.Fatalf("failed to get cwd: %v", err)
	}
	defer os.Chdir(oldCwd)

	// Change to temp directory
	err = os.Chdir(tempDir)
	if err != nil {
		t.Fatalf("failed to change dir: %v", err)
	}

	// Create .cheat directory
	localCheat := filepath.Join(tempDir, ".cheat")
	err = os.Mkdir(localCheat, 0755)
	if err != nil {
		t.Fatalf("failed to create .cheat dir: %v", err)
	}

	// Load config
	conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Check that local cheatpath was added
	found := false
	for _, cp := range conf.Cheatpaths {
		if cp.Name == "cwd" && cp.Path == localCheat {
			found = true
			break
		}
	}

	if !found {
		t.Error("local .cheat directory was not added to cheatpaths")
	}
}

// TestConfigDefaults tests default values
func TestConfigDefaults(t *testing.T) {
	// Load empty config
	conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Check defaults
	if conf.Style != "bw" {
		t.Errorf("expected default style 'bw', got %s", conf.Style)
	}

	if conf.Formatter != "terminal" {
		t.Errorf("expected default formatter 'terminal', got %s", conf.Formatter)
	}
}

// TestConfigSymlinkResolution tests symlink resolution
func TestConfigSymlinkResolution(t *testing.T) {
	// Create temp directory structure
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create target directory
	targetDir := filepath.Join(tempDir, "target")
	err = os.Mkdir(targetDir, 0755)
	if err != nil {
		t.Fatalf("failed to create target dir: %v", err)
	}

	// Create symlink
	linkPath := filepath.Join(tempDir, "link")
	err = os.Symlink(targetDir, linkPath)
	if err != nil {
		t.Fatalf("failed to create symlink: %v", err)
	}

	// Create config with symlink path
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ` + linkPath + `
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config with symlink resolution
	conf, err := New(map[string]interface{}{}, configFile, true)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Verify symlink was resolved
	if len(conf.Cheatpaths) > 0 && conf.Cheatpaths[0].Path != targetDir {
		t.Errorf("expected symlink to be resolved to %s, got %s", targetDir, conf.Cheatpaths[0].Path)
	}
}

// TestConfigBrokenSymlink tests broken symlink handling
func TestConfigBrokenSymlink(t *testing.T) {
	// Create temp directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create broken symlink
	linkPath := filepath.Join(tempDir, "broken-link")
	err = os.Symlink("/nonexistent/path", linkPath)
	if err != nil {
		t.Fatalf("failed to create symlink: %v", err)
	}

	// Create config with broken symlink
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ` + linkPath + `
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config with symlink resolution should fail
	_, err = New(map[string]interface{}{}, configFile, true)
	if err == nil {
		t.Error("expected error for broken symlink, got nil")
	}
}

// TestConfigTildeExpansionError tests tilde expansion error handling
func TestConfigTildeExpansionError(t *testing.T) {
	// This is tricky to test without mocking homedir.Expand
	// We'll create a config with an invalid home reference
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create config with user that likely doesn't exist
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ~nonexistentuser12345/cheat
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config - this may or may not fail depending on the system
	// but we're testing that it doesn't panic
	_, _ = New(map[string]interface{}{}, configFile, false)
}

// TestConfigGetCwdError tests error handling when os.Getwd fails
func TestConfigGetCwdError(t *testing.T) {
	// This is difficult to test without being able to break os.Getwd
	// We'll create a scenario where the current directory is removed

	// Create and enter a temp directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}

	oldCwd, err := os.Getwd()
	if err != nil {
		t.Fatalf("failed to get cwd: %v", err)
	}
	defer os.Chdir(oldCwd)

	err = os.Chdir(tempDir)
	if err != nil {
		t.Fatalf("failed to change dir: %v", err)
	}

	// Remove the directory we're in
	err = os.RemoveAll(tempDir)
	if err != nil {
		t.Fatalf("failed to remove temp dir: %v", err)
	}

	// Now os.Getwd should fail
	_, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	// This might not fail on all systems, so we just ensure no panic
	_ = err
}
52	internal/config/doc.go	Normal file
@@ -0,0 +1,52 @@
// Package config manages application configuration and settings.
//
// The config package provides functionality to:
//   - Load configuration from YAML files
//   - Validate configuration values
//   - Manage platform-specific configuration paths
//   - Handle editor and pager settings
//   - Configure colorization and formatting options
//
// # Configuration Structure
//
// The main configuration file (conf.yml) contains:
//   - Editor preferences
//   - Pager settings
//   - Colorization options
//   - Cheatpath definitions
//   - Formatting preferences
//
// Example configuration:
//
//	---
//	editor: vim
//	colorize: true
//	style: monokai
//	formatter: terminal256
//	pager: less -FRX
//	cheatpaths:
//	  - name: personal
//	    path: ~/cheat
//	    tags: []
//	    readonly: false
//	  - name: community
//	    path: ~/cheat/.cheat
//	    tags: [community]
//	    readonly: true
//
// # Platform-Specific Paths
//
// The package automatically detects configuration paths based on the operating system:
//   - Linux/Unix: $XDG_CONFIG_HOME/cheat/conf.yml or ~/.config/cheat/conf.yml
//   - macOS: ~/Library/Application Support/cheat/conf.yml
//   - Windows: %APPDATA%\cheat\conf.yml
//
// # Environment Variables
//
// The following environment variables are respected:
//   - CHEAT_CONFIG_PATH: Override the configuration file location
//   - CHEAT_USE_FZF: Enable fzf integration when set to "true"
//   - EDITOR: Default editor if not specified in config
//   - VISUAL: Fallback editor if EDITOR is not set
//   - PAGER: Default pager if not specified in config
package config
95	internal/config/editor_test.go	Normal file
@@ -0,0 +1,95 @@
package config

import (
	"os"
	"runtime"
	"testing"
)

// TestEditor tests the Editor function
func TestEditor(t *testing.T) {
	// Save original env vars
	oldVisual := os.Getenv("VISUAL")
	oldEditor := os.Getenv("EDITOR")
	defer func() {
		os.Setenv("VISUAL", oldVisual)
		os.Setenv("EDITOR", oldEditor)
	}()

	t.Run("windows default", func(t *testing.T) {
		if runtime.GOOS != "windows" {
			t.Skip("skipping windows test on non-windows platform")
		}

		// Clear env vars
		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "notepad" {
			t.Errorf("expected 'notepad' on windows, got %s", editor)
		}
	})

	t.Run("VISUAL takes precedence", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("VISUAL", "emacs")
		os.Setenv("EDITOR", "nano")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "emacs" {
			t.Errorf("expected VISUAL to take precedence, got %s", editor)
		}
	})

	t.Run("EDITOR when no VISUAL", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "vim")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "vim" {
			t.Errorf("expected EDITOR value, got %s", editor)
		}
	})

	t.Run("no editor found error", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		// Clear all environment variables
		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "")

		// Create a custom PATH that doesn't include common editors
		oldPath := os.Getenv("PATH")
		defer os.Setenv("PATH", oldPath)

		// Set a very limited PATH that won't have editors
		os.Setenv("PATH", "/nonexistent")

		editor, err := Editor()

		// If we found an editor, it's likely in the system
		// This test might not always produce an error on systems with editors
		if editor == "" && err == nil {
			t.Error("expected error when no editor found")
		}
	})
}
@@ -2,6 +2,8 @@ package config

 import (
 	"os"
+	"path/filepath"
+	"strings"
 	"testing"
 )

@@ -35,3 +37,84 @@ func TestInit(t *testing.T) {
 		t.Errorf("failed to write configs: want: %s, got: %s", conf, got)
 	}
 }
+
+// TestInitCreateDirectory tests that Init creates the directory if it doesn't exist
+func TestInitCreateDirectory(t *testing.T) {
+	// Create a temp directory
+	tempDir, err := os.MkdirTemp("", "cheat-init-test-*")
+	if err != nil {
+		t.Fatalf("failed to create temp dir: %v", err)
+	}
+	defer os.RemoveAll(tempDir)
+
+	// Path to a config file in a non-existent subdirectory
+	confPath := filepath.Join(tempDir, "subdir", "conf.yml")
+
+	// Initialize the config file
+	conf := "test config"
+	if err = Init(confPath, conf); err != nil {
+		t.Errorf("failed to init config file: %v", err)
+	}
+
+	// Verify the directory was created
+	if _, err := os.Stat(filepath.Dir(confPath)); os.IsNotExist(err) {
+		t.Error("Init did not create the directory")
+	}
+
+	// Verify the file was created with correct content
+	bytes, err := os.ReadFile(confPath)
+	if err != nil {
+		t.Errorf("failed to read config file: %v", err)
+	}
+	if string(bytes) != conf {
+		t.Errorf("config content mismatch: got %q, want %q", string(bytes), conf)
+	}
+}
+
+// TestInitWriteError tests error handling when file write fails
+func TestInitWriteError(t *testing.T) {
+	// Skip this test if running as root (can write anywhere)
+	if os.Getuid() == 0 {
+		t.Skip("Cannot test write errors as root")
+	}
+
+	// Try to write to a read-only directory
+	err := Init("/dev/null/impossible/path/conf.yml", "test")
+	if err == nil {
+		t.Error("expected error when writing to invalid path, got nil")
+	}
+	if err != nil && !strings.Contains(err.Error(), "failed to create") {
+		t.Errorf("expected 'failed to create' error, got: %v", err)
+	}
+}
+
+// TestInitExistingFile tests that Init overwrites existing files
+func TestInitExistingFile(t *testing.T) {
+	// Create a temp file
+	tempFile, err := os.CreateTemp("", "cheat-init-existing-*")
+	if err != nil {
+		t.Fatalf("failed to create temp file: %v", err)
+	}
+	defer os.Remove(tempFile.Name())
+
+	// Write initial content
+	initialContent := "initial content"
+	if err := os.WriteFile(tempFile.Name(), []byte(initialContent), 0644); err != nil {
+		t.Fatalf("failed to write initial content: %v", err)
+	}
+
+	// Initialize with new content
+	newContent := "new config content"
+	if err = Init(tempFile.Name(), newContent); err != nil {
+		t.Errorf("failed to init over existing file: %v", err)
+	}
+
+	// Verify the file was overwritten
+	bytes, err := os.ReadFile(tempFile.Name())
+	if err != nil {
+		t.Errorf("failed to read config file: %v", err)
+	}
+	if string(bytes) != newContent {
+		t.Errorf("config not overwritten: got %q, want %q", string(bytes), newContent)
+	}
+}
125	internal/config/new_test.go	Normal file
@@ -0,0 +1,125 @@
|
||||
package config

import (
	"os"
	"path/filepath"
	"testing"
)

func TestNewTrimsWhitespace(t *testing.T) {
	// Create a temporary config file with whitespace in editor and pager
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: " vim -c 'set number' "
pager: " less -R "
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		t.Fatalf("failed to load config: %v", err)
	}

	// Verify that the editor is trimmed
	expectedEditor := "vim -c 'set number'"
	if conf.Editor != expectedEditor {
		t.Errorf("editor not properly trimmed: got %q, want %q", conf.Editor, expectedEditor)
	}

	// Verify that the pager is trimmed
	expectedPager := "less -R"
	if conf.Pager != expectedPager {
		t.Errorf("pager not properly trimmed: got %q, want %q", conf.Pager, expectedPager)
	}
}

func TestNewEmptyEditorFallback(t *testing.T) {
	// Unset the editor environment variables so they don't interfere
	oldVisual := os.Getenv("VISUAL")
	oldEditor := os.Getenv("EDITOR")
	os.Unsetenv("VISUAL")
	os.Unsetenv("EDITOR")
	defer func() {
		os.Setenv("VISUAL", oldVisual)
		os.Setenv("EDITOR", oldEditor)
	}()

	// Create a config with a whitespace-only editor
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: " "
pager: less
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		// It's OK if this fails because no editor could be found;
		// the important thing is that it doesn't panic.
		return
	}

	// If it succeeded, the editor should not be empty (a fallback was used)
	if conf.Editor == "" {
		t.Error("editor should not be empty after fallback")
	}
}

func TestNewWhitespaceOnlyPager(t *testing.T) {
	// Create a config with a whitespace-only pager
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: vim
pager: " "
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		t.Fatalf("failed to load config: %v", err)
	}

	// The pager should be empty after trimming
	if conf.Pager != "" {
		t.Errorf("pager should be empty after trimming whitespace: got %q", conf.Pager)
	}
}

@@ -22,7 +22,7 @@ func Pager() string {
	// Otherwise, search for `pager`, `less`, and `more` on the `$PATH`. If
	// none are found, return an empty pager.
	for _, pager := range []string{"pager", "less", "more"} {
-		if path, err := exec.LookPath(pager); err != nil {
+		if path, err := exec.LookPath(pager); err == nil {
			return path
		}
	}

90 internal/config/pager_test.go Normal file
@@ -0,0 +1,90 @@
package config

import (
	"os"
	"runtime"
	"testing"
)

// TestPager tests the Pager function
func TestPager(t *testing.T) {
	// Save the original env var
	oldPager := os.Getenv("PAGER")
	defer os.Setenv("PAGER", oldPager)

	t.Run("windows default", func(t *testing.T) {
		if runtime.GOOS != "windows" {
			t.Skip("skipping windows test on non-windows platform")
		}

		os.Setenv("PAGER", "")
		pager := Pager()
		if pager != "more" {
			t.Errorf("expected 'more' on windows, got %s", pager)
		}
	})

	t.Run("PAGER env var", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "bat")
		pager := Pager()
		if pager != "bat" {
			t.Errorf("expected PAGER env var value, got %s", pager)
		}
	})

	t.Run("fallback to system pager", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "")
		pager := Pager()

		// Should find one of the fallback pagers or return an empty string
		validPagers := map[string]bool{
			"":      true, // no pager found
			"pager": true,
			"less":  true,
			"more":  true,
		}

		// Check whether the result is (a path ending in) one of these
		found := false
		for p := range validPagers {
			if p == "" && pager == "" {
				found = true
				break
			}
			if p != "" && (pager == p || len(pager) >= len(p) && pager[len(pager)-len(p):] == p) {
				found = true
				break
			}
		}

		if !found {
			t.Errorf("unexpected pager value: %s", pager)
		}
	})

	t.Run("no pager available", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "")

		// Save and modify PATH to ensure no pagers are found
		oldPath := os.Getenv("PATH")
		defer os.Setenv("PATH", oldPath)
		os.Setenv("PATH", "/nonexistent")

		pager := Pager()
		if pager != "" {
			t.Errorf("expected empty string when no pager found, got %s", pager)
		}
	})
}

45 internal/display/doc.go Normal file
@@ -0,0 +1,45 @@
// Package display handles output formatting and presentation for the cheat application.
//
// The display package provides utilities for:
//   - Writing output to stdout or a pager
//   - Formatting text with indentation
//   - Creating faint (dimmed) text for de-emphasis
//   - Managing colored output
//
// # Pager Integration
//
// The package integrates with system pagers (less, more, etc.) to handle
// long output. If a pager is configured and the output is to a terminal,
// content is automatically piped through the pager.
//
// # Text Formatting
//
// Various formatting utilities are provided:
//   - Faint: Creates dimmed text using ANSI escape codes
//   - Indent: Adds consistent indentation to text blocks
//   - Write: Intelligent output that uses stdout or pager as appropriate
//
// # Example Usage
//
//	// Write output, using pager if configured
//	if err := display.Write(output, config); err != nil {
//		log.Fatal(err)
//	}
//
//	// Create faint text for de-emphasis
//	fainted := display.Faint("(read-only)", config)
//
//	// Indent a block of text
//	indented := display.Indent(text, "  ")
//
// # Color Support
//
// The package respects the colorization settings from the config.
// When colorization is disabled, formatting functions like Faint
// return unmodified text.
//
// # Terminal Detection
//
// The package uses isatty to detect if output is to a terminal,
// which affects decisions about using a pager and applying colors.
package display

@@ -19,6 +19,11 @@ func Write(out string, conf config.Config) {
	}

	// otherwise, pipe output through the pager
+	writeToPager(out, conf)
+}
+
+// writeToPager writes output through a pager command
+func writeToPager(out string, conf config.Config) {
	parts := strings.Split(conf.Pager, " ")
	pager := parts[0]
	args := parts[1:]

136 internal/display/write_test.go Normal file
@@ -0,0 +1,136 @@
package display

import (
	"bytes"
	"io"
	"os"
	"os/exec"
	"strings"
	"testing"

	"github.com/cheat/cheat/internal/config"
)

// TestWriteToPager tests the writeToPager function
func TestWriteToPager(t *testing.T) {
	// Skip these tests in CI/CD environments, where interactive commands might not work
	if os.Getenv("CI") != "" {
		t.Skip("Skipping pager tests in CI environment")
	}

	// Note: we can't easily test os.Exit calls, so we focus on testing
	// writeToPager, which contains the core logic.

	t.Run("successful pager execution", func(t *testing.T) {
		// Save the original stdout
		oldStdout := os.Stdout
		defer func() {
			os.Stdout = oldStdout
		}()

		// Create a pipe for capturing output
		r, w, _ := os.Pipe()
		os.Stdout = w

		// Use 'cat' as a simple pager that just echoes its input
		conf := config.Config{
			Pager: "cat",
		}

		// writeToPager calls os.Exit on error, so we must be careful here;
		// 'cat' should always succeed.
		input := "Test output\n"

		// Run in a goroutine to avoid blocking
		done := make(chan bool)
		go func() {
			writeToPager(input, conf)
			done <- true
		}()

		// Wait for completion
		<-done

		// Close the write end and read the output
		w.Close()
		var buf bytes.Buffer
		io.Copy(&buf, r)

		// Verify the output
		if buf.String() != input {
			t.Errorf("expected output %q, got %q", input, buf.String())
		}
	})

	t.Run("pager with arguments", func(t *testing.T) {
		// Save the original stdout
		oldStdout := os.Stdout
		defer func() {
			os.Stdout = oldStdout
		}()

		// Create a pipe for capturing output
		r, w, _ := os.Pipe()
		os.Stdout = w

		// Use 'cat' with the '-A' flag (shows non-printing characters)
		conf := config.Config{
			Pager: "cat -A",
		}

		input := "Test\toutput\n"

		// Run in a goroutine
		done := make(chan bool)
		go func() {
			writeToPager(input, conf)
			done <- true
		}()

		// Wait for completion
		<-done

		// Close the write end and read the output
		w.Close()
		var buf bytes.Buffer
		io.Copy(&buf, r)

		// cat -A shows tabs as ^I and line endings as $
		expected := "Test^Ioutput$\n"
		if buf.String() != expected {
			t.Errorf("expected output %q, got %q", expected, buf.String())
		}
	})
}

// TestWriteToPagerError tests error handling in writeToPager
func TestWriteToPagerError(t *testing.T) {
	if os.Getenv("TEST_PAGER_ERROR_SUBPROCESS") == "1" {
		// This is the subprocess: run the actual test
		conf := config.Config{Pager: "/nonexistent/command"}
		writeToPager("test", conf)
		return
	}

	// Run the test in a subprocess to handle os.Exit
	cmd := exec.Command(os.Args[0], "-test.run=^TestWriteToPagerError$")
	cmd.Env = append(os.Environ(), "TEST_PAGER_ERROR_SUBPROCESS=1")

	output, err := cmd.CombinedOutput()

	// The subprocess should exit with an error
	if err == nil {
		t.Error("expected process to exit with error")
	}

	// The output should contain the error message
	if !strings.Contains(string(output), "failed to write to pager") {
		t.Errorf("expected error message about pager failure, got %q", string(output))
	}
}

180 internal/installer/prompt_test.go Normal file
@@ -0,0 +1,180 @@
package installer

import (
	"bytes"
	"fmt"
	"io"
	"os"
	"strings"
	"testing"
)

func TestPrompt(t *testing.T) {
	// Save the original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	tests := []struct {
		name       string
		prompt     string
		input      string
		defaultVal bool
		want       bool
		wantErr    bool
		wantPrompt string
	}{
		{
			name:       "answer yes",
			prompt:     "Continue?",
			input:      "y\n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer yes with uppercase",
			prompt:     "Continue?",
			input:      "Y\n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer yes with spaces",
			prompt:     "Continue?",
			input:      " y \n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer no",
			prompt:     "Continue?",
			input:      "n\n",
			defaultVal: true,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer no with any text",
			prompt:     "Continue?",
			input:      "anything\n",
			defaultVal: true,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "empty answer uses default true",
			prompt:     "Continue?",
			input:      "\n",
			defaultVal: true,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "empty answer uses default false",
			prompt:     "Continue?",
			input:      "\n",
			defaultVal: false,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "whitespace answer uses default",
			prompt:     "Continue?",
			input:      " \n",
			defaultVal: true,
			want:       true,
			wantPrompt: "Continue?: ",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Create a pipe for stdin
			r, w, _ := os.Pipe()
			os.Stdin = r

			// Create a pipe for stdout to capture the prompt
			rOut, wOut, _ := os.Pipe()
			os.Stdout = wOut

			// Write the input to stdin
			go func() {
				defer w.Close()
				io.WriteString(w, tt.input)
			}()

			// Call the function
			got, err := Prompt(tt.prompt, tt.defaultVal)

			// Close the stdout write end and read the prompt
			wOut.Close()
			var buf bytes.Buffer
			io.Copy(&buf, rOut)

			// Check the error
			if (err != nil) != tt.wantErr {
				t.Errorf("Prompt() error = %v, wantErr %v", err, tt.wantErr)
				return
			}

			// Check the result
			if got != tt.want {
				t.Errorf("Prompt() = %v, want %v", got, tt.want)
			}

			// Check that the prompt was displayed correctly
			if buf.String() != tt.wantPrompt {
				t.Errorf("Prompt display = %q, want %q", buf.String(), tt.wantPrompt)
			}
		})
	}
}

func TestPromptError(t *testing.T) {
	// Save the original stdin
	oldStdin := os.Stdin
	defer func() {
		os.Stdin = oldStdin
	}()

	// Create a pipe and close it immediately to simulate a read error
	r, w, _ := os.Pipe()
	os.Stdin = r
	r.Close()
	w.Close()

	// This should cause a read error
	_, err := Prompt("Test?", false)
	if err == nil {
		t.Fatal("expected error when reading from closed stdin, got nil")
	}
	if !strings.Contains(err.Error(), "failed to parse input") {
		t.Errorf("expected 'failed to parse input' error, got: %v", err)
	}
}

// TestPromptIntegration provides a simple integration test
func TestPromptIntegration(t *testing.T) {
	// This demonstrates how the prompt would be used in practice.
	// It's skipped by default, since it requires actual user input.
	if os.Getenv("TEST_INTERACTIVE") != "1" {
		t.Skip("Skipping interactive test - set TEST_INTERACTIVE=1 to run")
	}

	fmt.Println("\n=== Interactive Prompt Test ===")
	fmt.Println("You will be prompted to answer a question.")
	fmt.Println("Try different inputs: y, n, Y, N, empty (just press Enter)")

	result, err := Prompt("Would you like to continue? [Y/n]", true)
	if err != nil {
		t.Fatalf("Prompt failed: %v", err)
	}

	fmt.Printf("You answered: %v\n", result)
}

236 internal/installer/run_test.go Normal file
@@ -0,0 +1,236 @@
package installer

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
	"testing"
)

func TestRun(t *testing.T) {
	// Create a temporary directory for testing
	tempDir, err := os.MkdirTemp("", "cheat-installer-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Save the original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	tests := []struct {
		name          string
		configs       string
		confpath      string
		userInput     string
		wantErr       bool
		wantInErr     string
		checkFiles    []string
		dontWantFiles []string
	}{
		{
			name: "user declines community cheatsheets",
			configs: `---
editor: EDITOR_PATH
pager: PAGER_PATH
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false
`,
			confpath:      filepath.Join(tempDir, "conf1", "conf.yml"),
			userInput:     "n\n",
			wantErr:       false,
			checkFiles:    []string{"conf1/conf.yml"},
			dontWantFiles: []string{"conf1/cheatsheets/community", "conf1/cheatsheets/personal"},
		},
		{
			name: "user accepts but clone fails",
			configs: `---
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
`,
			confpath:  filepath.Join(tempDir, "conf2", "conf.yml"),
			userInput: "y\n",
			wantErr:   true,
			wantInErr: "failed to clone cheatsheets",
		},
		{
			name:      "invalid config path",
			configs:   "test",
			confpath:  "/nonexistent/path/conf.yml",
			userInput: "n\n",
			wantErr:   true,
			wantInErr: "failed to create config file",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Create a stdin pipe
			r, w, _ := os.Pipe()
			os.Stdin = r

			// Create a stdout pipe to suppress output
			_, wOut, _ := os.Pipe()
			os.Stdout = wOut

			// Write the user input
			go func() {
				defer w.Close()
				io.WriteString(w, tt.userInput)
			}()

			// Run the installer
			err := Run(tt.configs, tt.confpath)

			// Close the pipes
			wOut.Close()

			// Check the error
			if (err != nil) != tt.wantErr {
				t.Errorf("Run() error = %v, wantErr %v", err, tt.wantErr)
			}
			if err != nil && tt.wantInErr != "" && !strings.Contains(err.Error(), tt.wantInErr) {
				t.Errorf("Run() error = %v, want error containing %q", err, tt.wantInErr)
			}

			// Check the created files
			for _, file := range tt.checkFiles {
				path := filepath.Join(tempDir, file)
				if _, err := os.Stat(path); os.IsNotExist(err) {
					t.Errorf("expected file %s to exist, but it doesn't", path)
				}
			}

			// Check files that shouldn't exist
			for _, file := range tt.dontWantFiles {
				path := filepath.Join(tempDir, file)
				if _, err := os.Stat(path); err == nil {
					t.Errorf("expected file %s to not exist, but it does", path)
				}
			}
		})
	}
}

func TestRunPromptError(t *testing.T) {
	// Save the original stdin
	oldStdin := os.Stdin
	defer func() {
		os.Stdin = oldStdin
	}()

	// Close stdin to cause a prompt error
	r, w, _ := os.Pipe()
	os.Stdin = r
	r.Close()
	w.Close()

	tempDir, _ := os.MkdirTemp("", "cheat-installer-prompt-test-*")
	defer os.RemoveAll(tempDir)

	err := Run("test", filepath.Join(tempDir, "conf.yml"))
	if err == nil {
		t.Fatal("expected error when prompt fails, got nil")
	}
	if !strings.Contains(err.Error(), "failed to prompt") {
		t.Errorf("expected 'failed to prompt' error, got: %v", err)
	}
}

func TestRunStringReplacements(t *testing.T) {
	// Test that the path replacements work correctly
	configs := `---
editor: EDITOR_PATH
pager: PAGER_PATH
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
  - name: personal
    path: PERSONAL_PATH
`

	// Create a temp directory
	tempDir, err := os.MkdirTemp("", "cheat-installer-replace-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	confpath := filepath.Join(tempDir, "conf.yml")
	confdir := filepath.Dir(confpath)

	// Expected paths
	expectedCommunity := filepath.Join(confdir, "cheatsheets", "community")
	expectedPersonal := filepath.Join(confdir, "cheatsheets", "personal")

	// Save the original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	// Create a stdin pipe with an "n" answer
	r, w, _ := os.Pipe()
	os.Stdin = r
	go func() {
		defer w.Close()
		io.WriteString(w, "n\n")
	}()

	// Suppress stdout
	_, wOut, _ := os.Pipe()
	os.Stdout = wOut
	defer wOut.Close()

	// Run the installer
	err = Run(configs, confpath)
	if err != nil {
		t.Fatalf("Run() failed: %v", err)
	}

	// Read the created config file
	content, err := os.ReadFile(confpath)
	if err != nil {
		t.Fatalf("failed to read config file: %v", err)
	}

	// Check the replacements
	contentStr := string(content)
	if strings.Contains(contentStr, "COMMUNITY_PATH") {
		t.Error("COMMUNITY_PATH was not replaced")
	}
	if strings.Contains(contentStr, "PERSONAL_PATH") {
		t.Error("PERSONAL_PATH was not replaced")
	}
	if strings.Contains(contentStr, "EDITOR_PATH") && !strings.Contains(contentStr, fmt.Sprintf("editor: %s", "")) {
		t.Error("EDITOR_PATH was not replaced")
	}
	if strings.Contains(contentStr, "PAGER_PATH") && !strings.Contains(contentStr, fmt.Sprintf("pager: %s", "")) {
		t.Error("PAGER_PATH was not replaced")
	}

	// Verify that the correct paths were used
	if !strings.Contains(contentStr, expectedCommunity) {
		t.Errorf("expected community path %q in config", expectedCommunity)
	}
	if !strings.Contains(contentStr, expectedPersonal) {
		t.Errorf("expected personal path %q in config", expectedPersonal)
	}
}

@@ -8,11 +8,11 @@ import (
	"github.com/go-git/go-git/v5"
)

-// Clone clones the repo available at `url`
-func Clone(url string) error {
+// Clone clones the community cheatsheets repository to the specified directory
+func Clone(dir string) error {

	// clone the community cheatsheets
-	_, err := git.PlainClone(url, false, &git.CloneOptions{
+	_, err := git.PlainClone(dir, false, &git.CloneOptions{
		URL:      "https://github.com/cheat/cheatsheets.git",
		Depth:    1,
		Progress: os.Stdout,

80 internal/repo/clone_integration_test.go Normal file
@@ -0,0 +1,80 @@
//go:build integration
// +build integration

package repo

import (
	"os"
	"path/filepath"
	"testing"
)

// TestCloneIntegration performs a real clone operation to verify functionality.
// Run with: go test -tags=integration ./internal/repo -v -run TestCloneIntegration
func TestCloneIntegration(t *testing.T) {
	if testing.Short() {
		t.Skip("Skipping integration test in short mode")
	}

	// Create a temporary directory
	tmpDir, err := os.MkdirTemp("", "cheat-clone-integration-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tmpDir)

	destDir := filepath.Join(tmpDir, "cheatsheets")

	t.Logf("Cloning to: %s", destDir)

	// Perform the actual clone
	err = Clone(destDir)
	if err != nil {
		t.Fatalf("Clone() failed: %v", err)
	}

	// Verify that the clone succeeded
	info, err := os.Stat(destDir)
	if err != nil {
		t.Fatalf("destination directory not created: %v", err)
	}

	if !info.IsDir() {
		t.Fatal("destination is not a directory")
	}

	// Check for the .git directory
	gitDir := filepath.Join(destDir, ".git")
	if _, err := os.Stat(gitDir); err != nil {
		t.Error(".git directory not found")
	}

	// Check for some expected cheatsheets
	expectedFiles := []string{
		"bash", // bash cheatsheet should exist
		"git",  // git cheatsheet should exist
		"ls",   // ls cheatsheet should exist
	}

	foundCount := 0
	for _, file := range expectedFiles {
		path := filepath.Join(destDir, file)
		if _, err := os.Stat(path); err == nil {
			foundCount++
		}
	}

	if foundCount < 2 {
		t.Errorf("expected at least 2 common cheatsheets, found %d", foundCount)
	}

	t.Log("Clone integration test passed!")

	// Test cloning to an existing directory (should fail)
	err = Clone(destDir)
	if err == nil {
		t.Error("expected error when cloning to existing repository, got nil")
	} else {
		t.Logf("Expected error when cloning to existing dir: %v", err)
	}
}

49 internal/repo/clone_test.go Normal file
@@ -0,0 +1,49 @@
package repo

import (
	"os"
	"path/filepath"
	"testing"
)

// TestClone tests the Clone function
func TestClone(t *testing.T) {
	// This test requires network access, so we only test error cases
	// that don't require an actual clone.

	t.Run("clone to read-only directory", func(t *testing.T) {
		if os.Getuid() == 0 {
			t.Skip("Cannot test read-only directory as root")
		}

		// Create a temporary directory
		tempDir, err := os.MkdirTemp("", "cheat-clone-test-*")
		if err != nil {
			t.Fatalf("failed to create temp dir: %v", err)
		}
		defer os.RemoveAll(tempDir)

		// Create a read-only subdirectory
		readOnlyDir := filepath.Join(tempDir, "readonly")
		if err := os.Mkdir(readOnlyDir, 0555); err != nil {
			t.Fatalf("failed to create read-only dir: %v", err)
		}

		// Attempt to clone into the read-only directory
		targetDir := filepath.Join(readOnlyDir, "cheatsheets")
		err = Clone(targetDir)

		// Should fail, because we can't write to a read-only directory
		if err == nil {
			t.Error("expected error when cloning to read-only directory, got nil")
		}
	})

	t.Run("clone to invalid path", func(t *testing.T) {
		// Try to clone to a path with null bytes (invalid on most filesystems)
		err := Clone("/tmp/invalid\x00path")
		if err == nil {
			t.Error("expected error with invalid path, got nil")
		}
	})
}

177 internal/repo/gitdir_test.go Normal file
@@ -0,0 +1,177 @@
package repo

import (
	"fmt"
	"os"
	"path/filepath"
	"testing"
)

func TestGitDir(t *testing.T) {
	// Create a temporary directory for testing
	tempDir, err := os.MkdirTemp("", "cheat-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create the test directory structure
	testDirs := []string{
		filepath.Join(tempDir, ".git"),
		filepath.Join(tempDir, ".git", "objects"),
		filepath.Join(tempDir, ".git", "refs"),
		filepath.Join(tempDir, "regular"),
		filepath.Join(tempDir, "regular", ".git"),
		filepath.Join(tempDir, "submodule"),
	}

	for _, dir := range testDirs {
		if err := os.MkdirAll(dir, 0755); err != nil {
			t.Fatalf("failed to create dir %s: %v", dir, err)
		}
	}

	// Create the test files
	testFiles := map[string]string{
		filepath.Join(tempDir, ".gitignore"):           "*.tmp\n",
		filepath.Join(tempDir, ".gitattributes"):       "* text=auto\n",
		filepath.Join(tempDir, "submodule", ".git"):    "gitdir: ../.git/modules/submodule\n",
		filepath.Join(tempDir, "regular", "sheet.txt"): "content\n",
	}

	for file, content := range testFiles {
		if err := os.WriteFile(file, []byte(content), 0644); err != nil {
			t.Fatalf("failed to create file %s: %v", file, err)
		}
	}

	tests := []struct {
		name    string
		path    string
		want    bool
		wantErr bool
	}{
		{
			name: "not in git directory",
			path: filepath.Join(tempDir, "regular", "sheet.txt"),
			want: false,
		},
		{
			name: "in .git directory",
			path: filepath.Join(tempDir, ".git", "objects", "file"),
			want: true,
		},
		{
			name: "in .git/refs directory",
			path: filepath.Join(tempDir, ".git", "refs", "heads", "main"),
			want: true,
		},
		{
			name: ".gitignore file",
			path: filepath.Join(tempDir, ".gitignore"),
			want: false,
		},
		{
			name: ".gitattributes file",
			path: filepath.Join(tempDir, ".gitattributes"),
			want: false,
		},
		{
			name: "submodule with .git file",
			path: filepath.Join(tempDir, "submodule", "sheet.txt"),
			want: false,
		},
		{
			name: "path with .git in middle",
			path: filepath.Join(tempDir, "regular", ".git", "sheet.txt"),
			want: true,
		},
		{
			name: "nonexistent path without .git",
			path: filepath.Join(tempDir, "nonexistent", "file"),
			want: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := GitDir(tt.path)
			if (err != nil) != tt.wantErr {
				t.Errorf("GitDir() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if got != tt.want {
				t.Errorf("GitDir() = %v, want %v", got, tt.want)
			}
		})
	}
}

func TestGitDirEdgeCases(t *testing.T) {
	// Test paths that contain .git, but not as a path component
	tests := []struct {
		name string
		path string
		want bool
	}{
		{
			name: "file ending with .git",
			path: "/tmp/myfile.git",
			want: false,
		},
		{
			name: "directory ending with .git",
			path: "/tmp/myrepo.git",
			want: false,
		},
		{
			name: ".github directory",
			path: "/tmp/.github/workflows",
			want: false,
		},
		{
			name: "legitimate.git-repo name",
			path: "/tmp/legitimate.git-repo/file",
			want: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := GitDir(tt.path)
			if err != nil {
				// It's OK if the path doesn't exist for these edge-case tests
				return
			}
			if got != tt.want {
				t.Errorf("GitDir(%q) = %v, want %v", tt.path, got, tt.want)
			}
		})
	}
}

func TestGitDirPathSeparator(t *testing.T) {
	// Test that the function correctly uses os.PathSeparator.
	// This is important for cross-platform compatibility.

	// Create a path with the wrong separator for the current OS
	var wrongSep string
	if os.PathSeparator == '/' {
		wrongSep = `\`
	} else {
		wrongSep = `/`
	}

	// A path with the wrong separator should not be detected as a git dir
	path := fmt.Sprintf("some%spath%s.git%sfile", wrongSep, wrongSep, wrongSep)
	isGit, err := GitDir(path)
|
||||
|
||||
if err != nil {
|
||||
// Path doesn't exist, which is fine
|
||||
return
|
||||
}
|
||||
|
||||
if isGit {
|
||||
t.Errorf("GitDir() incorrectly detected git dir with wrong path separator")
|
||||
}
|
||||
}
|
||||
@@ -5,7 +5,7 @@ import (

 	"github.com/cheat/cheat/internal/config"

-	"github.com/alecthomas/chroma/quick"
+	"github.com/alecthomas/chroma/v2/quick"
 )

 // Colorize applies syntax-highlighting to a cheatsheet's Text.

@@ -32,3 +32,29 @@ func TestColorize(t *testing.T) {
 		t.Errorf("failed to colorize sheet: want: %s, got: %s", want, s.Text)
 	}
 }
+
+// TestColorizeError tests the error handling in Colorize
+func TestColorizeError(_ *testing.T) {
+	// Create a sheet with content
+	sheet := Sheet{
+		Text:   "some text",
+		Syntax: "invalidlexer12345", // Use an invalid lexer that might cause issues
+	}
+
+	// Create a config with an invalid formatter/style
+	conf := config.Config{
+		Formatter: "invalidformatter",
+		Style:     "invalidstyle",
+	}
+
+	// Store the original text
+	originalText := sheet.Text
+
+	// Colorize should not panic even with invalid settings
+	sheet.Colorize(conf)
+
+	// The text might be unchanged if there was an error, or it might be
+	// colorized. We're mainly testing that it doesn't panic.
+	_ = sheet.Text
+	_ = originalText
+}
@@ -39,6 +39,8 @@ func (s *Sheet) Copy(dest string) error {
 	// copy file contents
 	_, err = io.Copy(outfile, infile)
 	if err != nil {
+		// Clean up the partially written file on error
+		os.Remove(dest)
 		return fmt.Errorf(
 			"failed to copy file: infile: %s, outfile: %s, err: %v",
 			s.Path,
internal/sheet/copy_error_test.go (new file, 187 lines)
@@ -0,0 +1,187 @@
package sheet

import (
	"os"
	"path/filepath"
	"testing"
)

// TestCopyErrors tests error cases for the Copy method
func TestCopyErrors(t *testing.T) {
	tests := []struct {
		name    string
		setup   func() (*Sheet, string, func())
		wantErr bool
		errMsg  string
	}{
		{
			name: "source file does not exist",
			setup: func() (*Sheet, string, func()) {
				// Create a sheet with a non-existent path
				sheet := &Sheet{
					Title:     "test",
					Path:      "/non/existent/file.txt",
					CheatPath: "test",
				}
				dest := filepath.Join(os.TempDir(), "copy-test-dest.txt")
				cleanup := func() {
					os.Remove(dest)
				}
				return sheet, dest, cleanup
			},
			wantErr: true,
			errMsg:  "failed to open cheatsheet",
		},
		{
			name: "destination directory creation fails",
			setup: func() (*Sheet, string, func()) {
				// Create a source file
				src, err := os.CreateTemp("", "copy-test-src-*")
				if err != nil {
					t.Fatalf("failed to create temp file: %v", err)
				}
				src.WriteString("test content")
				src.Close()

				sheet := &Sheet{
					Title:     "test",
					Path:      src.Name(),
					CheatPath: "test",
				}

				// Create a file where we want a directory
				blockerFile := filepath.Join(os.TempDir(), "copy-blocker-file")
				if err := os.WriteFile(blockerFile, []byte("blocker"), 0644); err != nil {
					t.Fatalf("failed to create blocker file: %v", err)
				}

				// Try to create dest under the blocker file (will fail)
				dest := filepath.Join(blockerFile, "subdir", "dest.txt")

				cleanup := func() {
					os.Remove(src.Name())
					os.Remove(blockerFile)
				}
				return sheet, dest, cleanup
			},
			wantErr: true,
			errMsg:  "failed to create directory",
		},
		{
			name: "destination file creation fails",
			setup: func() (*Sheet, string, func()) {
				// Create a source file
				src, err := os.CreateTemp("", "copy-test-src-*")
				if err != nil {
					t.Fatalf("failed to create temp file: %v", err)
				}
				src.WriteString("test content")
				src.Close()

				sheet := &Sheet{
					Title:     "test",
					Path:      src.Name(),
					CheatPath: "test",
				}

				// Create a directory where we want the file
				destDir := filepath.Join(os.TempDir(), "copy-test-dir")
				if err := os.Mkdir(destDir, 0755); err != nil && !os.IsExist(err) {
					t.Fatalf("failed to create dest dir: %v", err)
				}

				cleanup := func() {
					os.Remove(src.Name())
					os.RemoveAll(destDir)
				}
				return sheet, destDir, cleanup
			},
			wantErr: true,
			errMsg:  "failed to create outfile",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			sheet, dest, cleanup := tt.setup()
			defer cleanup()

			err := sheet.Copy(dest)
			if (err != nil) != tt.wantErr {
				t.Errorf("Copy() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if err != nil && tt.errMsg != "" {
				if !contains(err.Error(), tt.errMsg) {
					t.Errorf("Copy() error = %v, want error containing %q", err, tt.errMsg)
				}
			}
		})
	}
}

// TestCopyIOError tests the io.Copy error case
func TestCopyIOError(t *testing.T) {
	// This is difficult to test without mocking io.Copy.
	// The error case would occur if the source file were modified
	// or removed after opening but before copying.
	t.Skip("Skipping io.Copy error test - requires file system race condition")
}

// TestCopyCleanupOnError verifies that partially written files are cleaned up on error
func TestCopyCleanupOnError(t *testing.T) {
	// Create a source file that we'll make unreadable after opening
	src, err := os.CreateTemp("", "copy-test-cleanup-*")
	if err != nil {
		t.Fatalf("failed to create temp file: %v", err)
	}
	defer os.Remove(src.Name())

	// Write some content
	content := "test content for cleanup"
	if _, err := src.WriteString(content); err != nil {
		t.Fatalf("failed to write content: %v", err)
	}
	src.Close()

	sheet := &Sheet{
		Title:     "test",
		Path:      src.Name(),
		CheatPath: "test",
	}

	// Destination path
	dest := filepath.Join(os.TempDir(), "copy-cleanup-test.txt")
	defer os.Remove(dest) // Clean up if the test fails

	// Make the source file unreadable (simulating a read error during copy).
	// This is platform-specific, but should work on Unix-like systems.
	if err := os.Chmod(src.Name(), 0000); err != nil {
		t.Skip("Cannot change file permissions on this platform")
	}
	defer os.Chmod(src.Name(), 0644) // Restore permissions for cleanup

	// Attempt to copy - this should fail during io.Copy
	err = sheet.Copy(dest)
	if err == nil {
		t.Error("Expected Copy to fail with permission error")
	}

	// Verify the destination file was cleaned up
	if _, err := os.Stat(dest); !os.IsNotExist(err) {
		t.Error("Destination file should have been removed after copy failure")
	}
}

func contains(s, substr string) bool {
	return len(s) >= len(substr) && (s == substr || len(s) > 0 && containsHelper(s, substr))
}

func containsHelper(s, substr string) bool {
	for i := 0; i <= len(s)-len(substr); i++ {
		if s[i:i+len(substr)] == substr {
			return true
		}
	}
	return false
}
internal/sheet/doc.go (new file, 65 lines)
@@ -0,0 +1,65 @@
// Package sheet provides functionality for parsing and managing individual cheat sheets.
//
// A sheet represents a single cheatsheet file containing helpful commands, notes,
// or documentation. Sheets can include optional YAML frontmatter for metadata
// such as tags and syntax highlighting preferences.
//
// # Sheet Format
//
// Sheets are plain text files that may begin with YAML frontmatter:
//
//	---
//	syntax: bash
//	tags: [networking, linux, ssh]
//	---
//	# Connect to remote server
//	ssh user@hostname
//
//	# Copy files over SSH
//	scp local_file user@hostname:/remote/path
//
// The frontmatter is optional. If omitted, the sheet will use default values.
//
// # Core Types
//
// The Sheet type contains:
//   - Title: The sheet's name (derived from filename)
//   - Path: Full filesystem path to the sheet
//   - Text: The content of the sheet (without frontmatter)
//   - Tags: Categories assigned to the sheet
//   - Syntax: Language hint for syntax highlighting
//   - ReadOnly: Whether the sheet can be modified
//
// # Key Functions
//
//   - New: Creates a new Sheet from a file path
//   - Parse: Extracts frontmatter and content from sheet text
//   - Search: Searches sheet content using regular expressions
//   - Colorize: Applies syntax highlighting to sheet content
//
// # Syntax Highlighting
//
// The package integrates with the Chroma library to provide syntax highlighting.
// Supported languages include bash, python, go, javascript, and many others.
// The syntax can be specified in the frontmatter or auto-detected.
//
// # Example Usage
//
//	// Load a sheet from disk
//	s, err := sheet.New("/path/to/sheet", []string{"personal"}, false)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Search for content
//	matches, err := s.Search("ssh", false)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Apply syntax highlighting
//	colorized, err := s.Colorize(config)
//	if err != nil {
//		log.Fatal(err)
//	}
package sheet
@@ -5,7 +5,7 @@ import (
 	"runtime"
 	"strings"

-	"gopkg.in/yaml.v2"
+	"gopkg.in/yaml.v3"
 )

 // Parse parses cheatsheet frontmatter
internal/sheet/parse_extended_test.go (new file, 54 lines)
@@ -0,0 +1,54 @@
package sheet

import (
	"runtime"
	"testing"
)

// TestParseWindowsLineEndings tests parsing with Windows line endings
func TestParseWindowsLineEndings(t *testing.T) {
	// Only test Windows line endings on Windows
	if runtime.GOOS != "windows" {
		t.Skip("Skipping Windows line ending test on non-Windows platform")
	}

	// stub our cheatsheet content with Windows line endings
	markdown := "---\r\nsyntax: go\r\ntags: [ test ]\r\n---\r\nTo foo the bar: baz"

	// parse the frontmatter
	fm, text, err := parse(markdown)

	// assert expectations
	if err != nil {
		t.Errorf("failed to parse markdown: %v", err)
	}

	want := "To foo the bar: baz"
	if text != want {
		t.Errorf("failed to parse text: want: %s, got: %s", want, text)
	}

	want = "go"
	if fm.Syntax != want {
		t.Errorf("failed to parse syntax: want: %s, got: %s", want, fm.Syntax)
	}
}

// TestParseInvalidYAML tests parsing with invalid YAML in the frontmatter
func TestParseInvalidYAML(t *testing.T) {
	// stub our cheatsheet content with invalid YAML
	markdown := `---
syntax: go
tags: [ test
unclosed bracket
---
To foo the bar: baz`

	// parse the frontmatter
	_, _, err := parse(markdown)

	// assert that an error was returned for the invalid YAML
	if err == nil {
		t.Error("expected error for invalid YAML, got nil")
	}
}
internal/sheet/parse_fuzz_test.go (new file, 132 lines)
@@ -0,0 +1,132 @@
package sheet

import (
	"strings"
	"testing"
)

// FuzzParse tests the parse function with fuzzing to uncover edge cases
// and potential panics in YAML frontmatter parsing
func FuzzParse(f *testing.F) {
	// Add a seed corpus with various valid and edge-case inputs

	// Valid frontmatter
	f.Add("---\nsyntax: go\n---\nContent")
	f.Add("---\ntags: [a, b]\n---\n")
	f.Add("---\nsyntax: bash\ntags: [linux, shell]\n---\n#!/bin/bash\necho hello")

	// No frontmatter
	f.Add("No frontmatter here")
	f.Add("")
	f.Add("Just plain text\nwith multiple lines")

	// Edge cases with delimiters
	f.Add("---")
	f.Add("---\n")
	f.Add("---\n---")
	f.Add("---\n---\n")
	f.Add("---\n---\n---")
	f.Add("---\n---\n---\n---")
	f.Add("------\n------")

	// Invalid YAML
	f.Add("---\n{invalid yaml\n---\n")
	f.Add("---\nsyntax: \"unclosed quote\n---\n")
	f.Add("---\ntags: [a, b,\n---\n")

	// Windows line endings
	f.Add("---\r\nsyntax: go\r\n---\r\nContent")
	f.Add("---\r\n---\r\n")

	// Mixed line endings
	f.Add("---\nsyntax: go\r\n---\nContent")
	f.Add("---\r\nsyntax: go\n---\r\nContent")

	// Unicode and special characters
	f.Add("---\ntags: [emoji, 🎉]\n---\n")
	f.Add("---\nsyntax: 中文\n---\n")
	f.Add("---\ntags: [\x00, \x01]\n---\n")

	// Very long inputs
	f.Add("---\ntags: [" + strings.Repeat("a,", 1000) + "a]\n---\n")
	f.Add("---\n" + strings.Repeat("field: value\n", 1000) + "---\n")

	// Nested structures
	f.Add("---\ntags:\n - nested\n - list\n---\n")
	f.Add("---\nmeta:\n author: test\n version: 1.0\n---\n")

	f.Fuzz(func(t *testing.T, input string) {
		// The parse function should never panic, regardless of input
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("parse panicked with input %q: %v", input, r)
				}
			}()

			fm, text, err := parse(input)

			// Verify invariants
			if err == nil {
				// If parsing succeeded, validate the result.

				// The returned text should be a suffix of the input
				// (either the whole input if no frontmatter, or the part after the frontmatter)
				if !strings.HasSuffix(input, text) && text != input {
					t.Errorf("returned text %q is not a valid suffix of input %q", text, input)
				}

				// If the input starts with a delimiter and has valid frontmatter,
				// the text should be shorter than the input
				if strings.HasPrefix(input, "---\n") || strings.HasPrefix(input, "---\r\n") {
					if len(fm.Tags) > 0 || fm.Syntax != "" {
						// We successfully parsed frontmatter, so text should be shorter
						if len(text) >= len(input) {
							t.Errorf("text length %d should be less than input length %d when frontmatter is parsed",
								len(text), len(input))
						}
					}
				}

				// Note: Tags can be nil when frontmatter is not present or empty.
				// This is expected behavior in Go for uninitialized slices.
			} else {
				// If parsing failed, the original input should be returned as text
				if text != input {
					t.Errorf("on error, text should equal input: got %q, want %q", text, input)
				}
			}
		}()
	})
}

// FuzzParseDelimiterHandling specifically tests delimiter edge cases
func FuzzParseDelimiterHandling(f *testing.F) {
	// Seed corpus focusing on delimiter variations
	f.Add("---", "content")
	f.Add("", "---")
	f.Add("---", "---")
	f.Add("", "")

	f.Fuzz(func(t *testing.T, prefix string, suffix string) {
		// Build inputs with controllable parts around the delimiters
		inputs := []string{
			prefix + "---\n" + suffix,
			prefix + "---\r\n" + suffix,
			prefix + "---\n---\n" + suffix,
			prefix + "---\r\n---\r\n" + suffix,
			prefix + "---\n" + "yaml: data\n" + "---\n" + suffix,
		}

		for _, input := range inputs {
			func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("parse panicked with constructed input: %v", r)
					}
				}()

				_, _, _ = parse(input)
			}()
		}
	})
}
@@ -9,16 +9,17 @@ import (
 func (s *Sheet) Search(reg *regexp.Regexp) string {

 	// record matches
-	matches := ""
+	var matches []string

 	// search through the cheatsheet's text line by line
 	for _, line := range strings.Split(s.Text, "\n\n") {

-		// exit early if the line doesn't match the regex
+		// save matching lines
 		if reg.MatchString(line) {
-			matches += line + "\n\n"
+			matches = append(matches, line)
 		}
 	}

-	return strings.TrimSpace(matches)
+	// Join matches with the same delimiter used for splitting
+	return strings.Join(matches, "\n\n")
 }
internal/sheet/search_fuzz_test.go (new file, 190 lines)
@@ -0,0 +1,190 @@
package sheet

import (
	"regexp"
	"strings"
	"testing"
	"time"
)

// FuzzSearchRegex tests the regex compilation and search functionality
// to ensure it handles malformed patterns gracefully and doesn't suffer
// from catastrophic backtracking
func FuzzSearchRegex(f *testing.F) {
	// Add a seed corpus with various regex patterns

	// Valid patterns
	f.Add("test", "This is a test string")
	f.Add("(?i)test", "This is a TEST string")
	f.Add("foo|bar", "foo and bar")
	f.Add("^start", "start of line\nnext line")
	f.Add("end$", "at the end\nnext line")
	f.Add("\\d+", "123 numbers 456")
	f.Add("[a-z]+", "lowercase UPPERCASE")

	// Edge cases and potentially problematic patterns
	f.Add("", "empty pattern")
	f.Add(".", "any character")
	f.Add(".*", "match everything")
	f.Add(".+", "match something")
	f.Add("\\", "backslash")
	f.Add("(", "unclosed paren")
	f.Add(")", "unmatched paren")
	f.Add("[", "unclosed bracket")
	f.Add("]", "unmatched bracket")
	f.Add("[^]", "negated empty class")
	f.Add("(?", "incomplete group")

	// Patterns that might cause performance issues
	f.Add("(a+)+", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(a*)*", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(a|a)*", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(.*)*", "any text here")
	f.Add("(\\d+)+", "123456789012345678901234567890x")

	// Unicode patterns
	f.Add("☺", "Unicode ☺ smiley")
	f.Add("[一-龯]", "Chinese 中文 characters")
	f.Add("\\p{L}+", "Unicode letters")

	// Very long patterns
	f.Add(strings.Repeat("a", 1000), "long pattern")
	f.Add(strings.Repeat("(a|b)", 100), "complex pattern")

	f.Fuzz(func(t *testing.T, pattern string, text string) {
		// Test 1: Regex compilation should not panic
		var reg *regexp.Regexp
		var compileErr error

		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("regexp.Compile panicked with pattern %q: %v", pattern, r)
				}
			}()

			reg, compileErr = regexp.Compile(pattern)
		}()

		// If compilation failed, that's OK - we're testing error handling
		if compileErr != nil {
			// This is expected for invalid patterns
			return
		}

		// Test 2: Create a sheet and test the Search method
		sheet := Sheet{
			Title: "test",
			Text:  text,
		}

		// Search should not panic
		var result string
		done := make(chan bool, 1)

		go func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Search panicked with pattern %q on text %q: %v", pattern, text, r)
				}
				done <- true
			}()

			result = sheet.Search(reg)
		}()

		// Time out after 100ms to catch catastrophic backtracking
		select {
		case <-done:
			// Search completed successfully
		case <-time.After(100 * time.Millisecond):
			t.Errorf("Search timed out (possible catastrophic backtracking) with pattern %q on text %q", pattern, text)
		}

		// Test 3: Verify search result invariants
		if result != "" {
			// The Search function splits by "\n\n", so compare using the same logic
			resultLines := strings.Split(result, "\n\n")
			textLines := strings.Split(text, "\n\n")

			// Every result line should exist in the original text lines
			for _, rLine := range resultLines {
				found := false
				for _, tLine := range textLines {
					if rLine == tLine {
						found = true
						break
					}
				}
				if !found && rLine != "" {
					t.Errorf("Search result contains line not in original text: %q", rLine)
				}
			}
		}
	})
}

// FuzzSearchCatastrophicBacktracking specifically tests for regex patterns
// that could cause performance issues
func FuzzSearchCatastrophicBacktracking(f *testing.F) {
	// Seed with patterns known to potentially cause issues
	f.Add("a", 10, 5)
	f.Add("x", 20, 3)

	f.Fuzz(func(t *testing.T, char string, repeats int, groups int) {
		// Limit the size to avoid memory issues in the test
		if repeats > 30 || repeats < 0 || groups > 10 || groups < 0 || len(char) > 5 {
			t.Skip("Skipping invalid or overly large test case")
		}

		// Construct patterns that might cause backtracking
		patterns := []string{
			strings.Repeat(char, repeats),
			"(" + char + "+)+",
			"(" + char + "*)*",
			"(" + char + "|" + char + ")*",
		}

		// Add nested groups
		if groups > 0 && groups < 10 {
			nested := char
			for i := 0; i < groups; i++ {
				nested = "(" + nested + ")+"
			}
			patterns = append(patterns, nested)
		}

		// Test text that might trigger backtracking
		testText := strings.Repeat(char, repeats) + "x"

		for _, pattern := range patterns {
			// Try to compile the pattern
			reg, err := regexp.Compile(pattern)
			if err != nil {
				// Invalid pattern, skip
				continue
			}

			// Test with a timeout
			done := make(chan bool, 1)

			go func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("Search panicked with backtracking pattern %q: %v", pattern, r)
					}
					done <- true
				}()

				sheet := Sheet{Text: testText}
				_ = sheet.Search(reg)
			}()

			select {
			case <-done:
				// Completed successfully
			case <-time.After(50 * time.Millisecond):
				t.Logf("Warning: potential backtracking issue with pattern %q (completed slowly)", pattern)
			}
		}
	})
}
internal/sheet/tagged_fuzz_test.go (new file, 94 lines)
@@ -0,0 +1,94 @@
package sheet

import (
	"strings"
	"testing"
)

// FuzzTagged tests the Tagged function with potentially malicious tag inputs.
//
// Threat model: An attacker crafts a malicious cheatsheet with specially
// crafted tags that could cause issues when a user searches/filters by tags.
// This is particularly relevant for shared community cheatsheets.
func FuzzTagged(f *testing.F) {
	// Add a seed corpus with potentially problematic inputs.
	// These represent tags an attacker might use in a malicious cheatsheet.
	f.Add("normal", "normal")
	f.Add("", "")
	f.Add(" ", " ")
	f.Add("\n", "\n")
	f.Add("\r\n", "\r\n")
	f.Add("\x00", "\x00")                         // Null byte
	f.Add("../../etc/passwd", "../../etc/passwd") // Path traversal attempt
	f.Add("'; DROP TABLE sheets;--", "sql")       // SQL injection attempt
	f.Add("<script>alert('xss')</script>", "xss") // XSS attempt
	f.Add("${HOME}", "${HOME}")                   // Environment variable
	f.Add("$(whoami)", "$(whoami)")               // Command substitution
	f.Add("`date`", "`date`")                     // Command substitution
	f.Add("\\x41\\x42", "\\x41\\x42")             // Escape sequences
	f.Add("%00", "%00")                           // URL-encoded null
	f.Add("tag\nwith\nnewlines", "tag")
	f.Add(strings.Repeat("a", 10000), "a") // Very long tag
	f.Add("🎉", "🎉")                        // Unicode
	f.Add("\U0001F4A9", "\U0001F4A9")      // Non-BMP Unicode emoji
	f.Add("tag with spaces", "tag with spaces")
	f.Add("TAG", "tag") // Case sensitivity check
	f.Add("tag", "TAG") // Case sensitivity check

	f.Fuzz(func(t *testing.T, sheetTag string, searchTag string) {
		// Create a sheet with the potentially malicious tag
		sheet := Sheet{
			Title: "test",
			Tags:  []string{sheetTag},
		}

		// The Tagged function should never panic regardless of input
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tagged panicked with sheetTag=%q, searchTag=%q: %v",
						sheetTag, searchTag, r)
				}
			}()

			result := sheet.Tagged(searchTag)

			// Verify the result is consistent with a simple string comparison
			expected := false
			for _, tag := range sheet.Tags {
				if tag == searchTag {
					expected = true
					break
				}
			}

			if result != expected {
				t.Errorf("Tagged returned %v but expected %v for sheetTag=%q, searchTag=%q",
					result, expected, sheetTag, searchTag)
			}

			// Additional invariant: Tagged should be case-sensitive
			if sheetTag != searchTag && result {
				t.Errorf("Tagged matched different strings: sheetTag=%q, searchTag=%q",
					sheetTag, searchTag)
			}
		}()

		// Test with multiple tags, including the fuzzed one
		sheetMulti := Sheet{
			Title: "test",
			Tags:  []string{"safe1", sheetTag, "safe2", sheetTag}, // Duplicate tags
		}

		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tagged panicked with multiple tags including %q: %v",
						sheetTag, r)
				}
			}()

			_ = sheetMulti.Tagged(searchTag)
		}()
	})
}
internal/sheet/testdata/fuzz/FuzzSearchCatastrophicBacktracking/3ad1cae1b78a2478 (new file, vendored, 4 lines)
@@ -0,0 +1,4 @@
go test fuzz v1
string("0")
int(-6)
int(5)
internal/sheet/testdata/fuzz/FuzzSearchRegex/74c0a5e8e3464bfd (new file, vendored, 3 lines)
@@ -0,0 +1,3 @@
go test fuzz v1
string(".")
string(" 0000\n\n\n\n00000")
internal/sheets/doc.go (new file, 65 lines)
@@ -0,0 +1,65 @@
// Package sheets manages collections of cheat sheets across multiple cheatpaths.
|
||||
//
|
||||
// The sheets package provides functionality to:
|
||||
// - Load sheets from multiple cheatpaths
|
||||
// - Consolidate duplicate sheets (with precedence rules)
// - Filter sheets by tags
// - Sort sheets alphabetically
// - Extract unique tags across all sheets
//
// # Loading Sheets
//
// Sheets are loaded recursively from cheatpath directories, excluding:
//
// - Hidden files (starting with .)
// - Files in .git directories
// - Files with extensions (sheets have no extension)
//
// # Consolidation
//
// When multiple cheatpaths contain sheets with the same name, consolidation
// rules apply based on the order of cheatpaths. Sheets from earlier paths
// override those from later paths, allowing personal sheets to override
// community sheets.
//
// Example:
//
//	cheatpaths:
//	1. personal: ~/cheat
//	2. community: ~/cheat/community
//
// If both contain "git", the version from "personal" is used.
//
// # Filtering
//
// Sheets can be filtered by:
//
// - Tags: Include only sheets with specific tags
// - Cheatpath: Include only sheets from specific paths
//
// # Key Functions
//
// - Load: Loads all sheets from the given cheatpaths
// - Filter: Filters sheets by tag
// - Consolidate: Merges sheets from multiple paths with precedence
// - Sort: Sorts sheets alphabetically by title
// - Tags: Extracts all unique tags from sheets
//
// # Example Usage
//
//	// Load sheets from all cheatpaths
//	allSheets, err := sheets.Load(cheatpaths)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Consolidate to handle duplicates
//	consolidated := sheets.Consolidate(allSheets)
//
//	// Filter by tag
//	filtered := sheets.Filter(consolidated, []string{"networking"})
//
//	// Sort alphabetically
//	sheets.Sort(filtered)
//
//	// Get all unique tags
//	tags := sheets.Tags(consolidated)
package sheets
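The first-path-wins precedence described above can be sketched as a small standalone program. This is an illustrative sketch only: `consolidate` is a hypothetical helper operating on plain `map[string]string` stand-ins, not the package's actual `Consolidate` function, whose signature may differ.

```go
package main

import "fmt"

// consolidate merges per-cheatpath sheet maps into one map. Paths are
// visited in order, and the first occurrence of a title wins, so sheets
// from earlier (e.g. personal) paths override those from later
// (e.g. community) paths.
func consolidate(paths []map[string]string) map[string]string {
	merged := make(map[string]string)
	for _, path := range paths {
		for title, body := range path {
			if _, exists := merged[title]; !exists {
				merged[title] = body
			}
		}
	}
	return merged
}

func main() {
	personal := map[string]string{"git": "personal git sheet"}
	community := map[string]string{"git": "community git sheet", "tar": "tar sheet"}

	merged := consolidate([]map[string]string{personal, community})
	fmt.Println(merged["git"]) // the "personal" version wins
	fmt.Println(merged["tar"])
}
```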
@@ -2,6 +2,7 @@ package sheets

 import (
 	"strings"
+	"unicode/utf8"

 	"github.com/cheat/cheat/internal/sheet"
 )
@@ -31,7 +32,8 @@ func Filter(
 	// iterate over each tag. If the sheet does not match *all* tags, filter
 	// it out.
 	for _, tag := range tags {
-		if !sheet.Tagged(strings.TrimSpace(tag)) {
+		trimmed := strings.TrimSpace(tag)
+		if trimmed == "" || !utf8.ValidString(trimmed) || !sheet.Tagged(trimmed) {
 			keep = false
 		}
 	}

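The guard added in the hunk above can be exercised in isolation. The sketch below reimplements just the trimmed-tag check (`validTag` is an illustrative helper name, not part of the package); note that `"\xd7"` — the byte from the fuzz corpus below — is rejected as invalid UTF-8.

```go
package main

import (
	"fmt"
	"strings"
	"unicode/utf8"
)

// validTag mirrors the new guard: a tag is usable only if, after
// trimming whitespace, it is non-empty and valid UTF-8.
func validTag(tag string) (string, bool) {
	trimmed := strings.TrimSpace(tag)
	if trimmed == "" || !utf8.ValidString(trimmed) {
		return "", false
	}
	return trimmed, true
}

func main() {
	for _, tag := range []string{" linux ", "", "   ", "\xd7"} {
		if trimmed, ok := validTag(tag); ok {
			fmt.Printf("keep %q\n", trimmed)
		} else {
			fmt.Printf("drop %q\n", tag)
		}
	}
}
```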
177 internal/sheets/filter_fuzz_test.go Normal file
@@ -0,0 +1,177 @@
package sheets

import (
	"strings"
	"testing"

	"github.com/cheat/cheat/internal/sheet"
)

// FuzzFilter tests the Filter function with various tag combinations
func FuzzFilter(f *testing.F) {
	// Add seed corpus with various tag scenarios
	// Format: "tags to filter by" (comma-separated)
	f.Add("linux")
	f.Add("linux,bash")
	f.Add("linux,bash,ssh")
	f.Add("")
	f.Add(" ")
	f.Add(" linux ")
	f.Add("linux,")
	f.Add(",linux")
	f.Add(",,")
	f.Add("linux,,bash")
	f.Add("tag-with-dash")
	f.Add("tag_with_underscore")
	f.Add("UPPERCASE")
	f.Add("miXedCase")
	f.Add("🎉emoji")
	f.Add("tag with spaces")
	f.Add("\ttab\ttag")
	f.Add("tag\nwith\nnewline")
	f.Add("very-long-tag-name-that-might-cause-issues-somewhere")
	f.Add(strings.Repeat("a,", 100) + "a")

	f.Fuzz(func(t *testing.T, tagString string) {
		// Split the tag string into individual tags
		var tags []string
		if tagString != "" {
			tags = strings.Split(tagString, ",")
		}

		// Create test data - some sheets with various tags
		cheatpaths := []map[string]sheet.Sheet{
			{
				"sheet1": sheet.Sheet{
					Title: "sheet1",
					Tags:  []string{"linux", "bash"},
				},
				"sheet2": sheet.Sheet{
					Title: "sheet2",
					Tags:  []string{"linux", "ssh", "networking"},
				},
				"sheet3": sheet.Sheet{
					Title: "sheet3",
					Tags:  []string{"UPPERCASE", "miXedCase"},
				},
			},
			{
				"sheet4": sheet.Sheet{
					Title: "sheet4",
					Tags:  []string{"tag with spaces", "🎉emoji"},
				},
				"sheet5": sheet.Sheet{
					Title: "sheet5",
					Tags:  []string{}, // No tags
				},
			},
		}

		// The function should not panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Filter panicked with tags %q: %v", tags, r)
				}
			}()

			result := Filter(cheatpaths, tags)

			// Verify invariants
			// 1. Result should have same number of cheatpaths
			if len(result) != len(cheatpaths) {
				t.Errorf("Filter changed number of cheatpaths: got %d, want %d",
					len(result), len(cheatpaths))
			}

			// 2. Each filtered sheet should contain all requested tags
			for _, filteredPath := range result {
				for title, sheet := range filteredPath {
					// Verify this sheet has all the tags we filtered for
					for _, tag := range tags {
						trimmedTag := strings.TrimSpace(tag)
						if trimmedTag == "" {
							continue // Skip empty tags
						}
						if !sheet.Tagged(trimmedTag) {
							t.Errorf("Sheet %q passed filter but doesn't have tag %q",
								title, trimmedTag)
						}
					}
				}
			}

			// 3. Empty tag list should return all sheets
			if len(tags) == 0 || (len(tags) == 1 && tags[0] == "") {
				totalOriginal := 0
				totalFiltered := 0
				for _, path := range cheatpaths {
					totalOriginal += len(path)
				}
				for _, path := range result {
					totalFiltered += len(path)
				}
				if totalFiltered != totalOriginal {
					t.Errorf("Empty filter should return all sheets: got %d, want %d",
						totalFiltered, totalOriginal)
				}
			}
		}()
	})
}

// FuzzFilterEdgeCases tests Filter with extreme inputs
func FuzzFilterEdgeCases(f *testing.F) {
	// Seed with number of tags and tag length
	f.Add(0, 0)
	f.Add(1, 10)
	f.Add(10, 10)
	f.Add(100, 5)
	f.Add(1000, 3)

	f.Fuzz(func(t *testing.T, numTags int, tagLen int) {
		// Limit to reasonable values to avoid memory issues
		if numTags > 1000 || numTags < 0 || tagLen > 100 || tagLen < 0 {
			t.Skip("Skipping unreasonable test case")
		}

		// Generate tags
		tags := make([]string, numTags)
		for i := 0; i < numTags; i++ {
			// Create a tag of specified length
			if tagLen > 0 {
				tags[i] = strings.Repeat("a", tagLen) + string(rune(i%26+'a'))
			}
		}

		// Create a sheet with no tags (should be filtered out)
		cheatpaths := []map[string]sheet.Sheet{
			{
				"test": sheet.Sheet{
					Title: "test",
					Tags:  []string{},
				},
			},
		}

		// Should not panic with many tags
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Filter panicked with %d tags of length %d: %v",
						numTags, tagLen, r)
				}
			}()

			result := Filter(cheatpaths, tags)

			// With non-matching tags, result should be empty
			if numTags > 0 && tagLen > 0 {
				if len(result[0]) != 0 {
					t.Errorf("Expected empty result with non-matching tags, got %d sheets",
						len(result[0]))
				}
			}
		}()
	})
}

@@ -20,7 +20,7 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
 	sheets := make([]map[string]sheet.Sheet, len(cheatpaths))

 	// iterate over each cheatpath
-	for _, cheatpath := range cheatpaths {
+	for i, cheatpath := range cheatpaths {

 		// vivify the map of cheatsheets on this specific cheatpath
 		pathsheets := make(map[string]sheet.Sheet)
@@ -43,6 +43,19 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
 				return nil
 			}

+			// get the base filename
+			filename := filepath.Base(path)
+
+			// skip hidden files (files that start with a dot)
+			if strings.HasPrefix(filename, ".") {
+				return nil
+			}
+
+			// skip files with extensions (cheatsheets have no extension)
+			if filepath.Ext(filename) != "" {
+				return nil
+			}
+
 			// calculate the cheatsheet's "title" (the phrase with which it may be
 			// accessed. Eg: `cheat tar` - `tar` is the title)
 			title := strings.TrimPrefix(
@@ -88,7 +101,7 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {

 		// store the sheets on this cheatpath alongside the other cheatsheets on
 		// other cheatpaths
-		sheets = append(sheets, pathsheets)
+		sheets[i] = pathsheets
 	}

 	// return the cheatsheets, grouped by cheatpath

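The two new exclusion rules above can be demonstrated standalone. In the sketch below, `isCheatsheetFile` is an illustrative helper (the real code inlines these checks inside `Load`'s directory walk), but the two conditions are exactly those from the diff.

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// isCheatsheetFile applies the exclusion rules added above: hidden
// files and files with extensions are skipped, because cheatsheets
// are plain, extensionless files.
func isCheatsheetFile(path string) bool {
	// get the base filename
	filename := filepath.Base(path)

	// skip hidden files (files that start with a dot)
	if strings.HasPrefix(filename, ".") {
		return false
	}

	// skip files with extensions (cheatsheets have no extension)
	if filepath.Ext(filename) != "" {
		return false
	}

	return true
}

func main() {
	for _, p := range []string{"cheat/tar", "cheat/.hidden", "cheat/notes.md"} {
		fmt.Printf("%s -> %v\n", p, isCheatsheetFile(p))
	}
}
```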
@@ -26,19 +26,26 @@ func TestLoad(t *testing.T) {
 	}

 	// load cheatsheets
-	sheets, err := Load(cheatpaths)
+	cheatpathSheets, err := Load(cheatpaths)
 	if err != nil {
 		t.Errorf("failed to load cheatsheets: %v", err)
 	}

 	// assert that the correct number of sheets loaded
 	// (sheet load details are tested in `sheet_test.go`)
+	totalSheets := 0
+	for _, sheets := range cheatpathSheets {
+		totalSheets += len(sheets)
+	}
+
+	// we expect 4 total sheets (2 from community, 2 from personal)
+	// hidden files and files with extensions are excluded
 	want := 4
-	if len(sheets) != want {
+	if totalSheets != want {
 		t.Errorf(
 			"failed to load correct number of cheatsheets: want: %d, got: %d",
 			want,
-			len(sheets),
+			totalSheets,
 		)
 	}
 }

@@ -2,6 +2,7 @@ package sheets

 import (
 	"sort"
+	"unicode/utf8"

 	"github.com/cheat/cheat/internal/sheet"
 )
@@ -16,10 +17,13 @@ func Tags(cheatpaths []map[string]sheet.Sheet) []string {
 	for _, path := range cheatpaths {
 		for _, sheet := range path {
 			for _, tag := range sheet.Tags {
-				tags[tag] = true
+				// Skip invalid UTF-8 tags to prevent downstream issues
+				if utf8.ValidString(tag) {
+					tags[tag] = true
+				}
 			}
 		}
 	}

 	// restructure the map into a slice
 	sorted := []string{}

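The dedupe-validate-sort behaviour of `Tags` after this change can be sketched in a standalone program. `uniqueTags` and its flat `[]string` input are illustrative simplifications (the real function walks sheets grouped by cheatpath), but the UTF-8 guard and the map-then-sort shape match the diff; `"\xf0"` is the invalid byte from the fuzz corpus below.

```go
package main

import (
	"fmt"
	"sort"
	"unicode/utf8"
)

// uniqueTags collects each valid-UTF-8 tag once, then returns the
// tags sorted alphabetically, mirroring Tags after the change above.
func uniqueTags(tags []string) []string {
	seen := make(map[string]bool)
	for _, tag := range tags {
		// Skip invalid UTF-8 tags to prevent downstream issues
		if utf8.ValidString(tag) {
			seen[tag] = true
		}
	}

	// restructure the map into a sorted slice
	sorted := []string{}
	for tag := range seen {
		sorted = append(sorted, tag)
	}
	sort.Strings(sorted)
	return sorted
}

func main() {
	fmt.Println(uniqueTags([]string{"ssh", "linux", "ssh", "\xf0", "bash"}))
	// → [bash linux ssh]
}
```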
190 internal/sheets/tags_fuzz_test.go Normal file
@@ -0,0 +1,190 @@
package sheets

import (
	"strings"
	"testing"
	"unicode/utf8"

	"github.com/cheat/cheat/internal/sheet"
)

// FuzzTags tests the Tags function with various tag combinations
func FuzzTags(f *testing.F) {
	// Add seed corpus
	// Format: comma-separated tags that will be distributed across sheets
	f.Add("linux,bash,ssh")
	f.Add("")
	f.Add("single")
	f.Add("duplicate,duplicate,duplicate")
	f.Add(" spaces , around , tags ")
	f.Add("MiXeD,UPPER,lower")
	f.Add("special-chars,under_score,dot.ted")
	f.Add("emoji🎉,unicode中文,symbols@#$")
	f.Add("\ttab,\nnewline,\rcarriage")
	f.Add(",,,,")                                          // Multiple empty tags
	f.Add(strings.Repeat("tag,", 100))                     // Many tags
	f.Add("a," + strings.Repeat("very-long-tag-name", 10)) // Long tag names

	f.Fuzz(func(t *testing.T, tagString string) {
		// Split tags and distribute them across multiple sheets
		var allTags []string
		if tagString != "" {
			allTags = strings.Split(tagString, ",")
		}

		// Create test cheatpaths with various tag distributions
		cheatpaths := []map[string]sheet.Sheet{}

		// Distribute tags across 3 paths with overlapping tags
		for i := 0; i < 3; i++ {
			path := make(map[string]sheet.Sheet)

			// Each path gets some subset of tags
			for j, tag := range allTags {
				if j%3 == i || j%(i+2) == 0 { // Create some overlap
					sheetName := string(rune('a' + j%26))
					path[sheetName] = sheet.Sheet{
						Title: sheetName,
						Tags:  []string{tag},
					}
				}
			}

			// Add a sheet with multiple tags
			if len(allTags) > 1 {
				path["multi"] = sheet.Sheet{
					Title: "multi",
					Tags:  allTags[:len(allTags)/2+1], // First half of tags
				}
			}

			cheatpaths = append(cheatpaths, path)
		}

		// The function should not panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tags panicked with input %q: %v", tagString, r)
				}
			}()

			result := Tags(cheatpaths)

			// Verify invariants
			// 1. Result should be sorted
			for i := 1; i < len(result); i++ {
				if result[i-1] >= result[i] {
					t.Errorf("Tags not sorted: %q >= %q at positions %d, %d",
						result[i-1], result[i], i-1, i)
				}
			}

			// 2. No duplicates in result
			seen := make(map[string]bool)
			for _, tag := range result {
				if seen[tag] {
					t.Errorf("Duplicate tag in result: %q", tag)
				}
				seen[tag] = true
			}

			// 3. All non-empty tags from input should be in result
			// (This is approximate since we distributed tags in a complex way)
			inputTags := make(map[string]bool)
			for _, tag := range allTags {
				if tag != "" {
					inputTags[tag] = true
				}
			}

			resultTags := make(map[string]bool)
			for _, tag := range result {
				resultTags[tag] = true
			}

			// Result might have fewer tags due to distribution logic,
			// but shouldn't have tags not in the input
			for tag := range resultTags {
				found := false
				for inputTag := range inputTags {
					if tag == inputTag {
						found = true
						break
					}
				}
				if !found && tag != "" {
					t.Errorf("Result contains tag %q not derived from input", tag)
				}
			}

			// 4. Valid UTF-8 (Tags function should filter out invalid UTF-8)
			for _, tag := range result {
				if !utf8.ValidString(tag) {
					t.Errorf("Invalid UTF-8 in tag: %q", tag)
				}
			}
		}()
	})
}

// FuzzTagsStress tests Tags function with large numbers of tags
func FuzzTagsStress(f *testing.F) {
	// Seed: number of unique tags, number of sheets, tags per sheet
	f.Add(10, 10, 5)
	f.Add(100, 50, 10)
	f.Add(1000, 100, 20)

	f.Fuzz(func(t *testing.T, numUniqueTags int, numSheets int, tagsPerSheet int) {
		// Limit to reasonable values
		if numUniqueTags > 1000 || numUniqueTags < 0 ||
			numSheets > 1000 || numSheets < 0 ||
			tagsPerSheet > 100 || tagsPerSheet < 0 {
			t.Skip("Skipping unreasonable test case")
		}

		// Generate unique tags
		uniqueTags := make([]string, numUniqueTags)
		for i := 0; i < numUniqueTags; i++ {
			uniqueTags[i] = "tag" + string(rune(i))
		}

		// Create sheets with random tags
		cheatpaths := []map[string]sheet.Sheet{
			make(map[string]sheet.Sheet),
		}

		for i := 0; i < numSheets; i++ {
			// Select random tags for this sheet
			sheetTags := make([]string, 0, tagsPerSheet)
			for j := 0; j < tagsPerSheet && j < numUniqueTags; j++ {
				// Distribute tags across sheets
				tagIndex := (i*tagsPerSheet + j) % numUniqueTags
				sheetTags = append(sheetTags, uniqueTags[tagIndex])
			}

			cheatpaths[0]["sheet"+string(rune(i))] = sheet.Sheet{
				Title: "sheet" + string(rune(i)),
				Tags:  sheetTags,
			}
		}

		// Should handle large numbers efficiently
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tags panicked with %d unique tags, %d sheets, %d tags/sheet: %v",
						numUniqueTags, numSheets, tagsPerSheet, r)
				}
			}()

			result := Tags(cheatpaths)

			// Should have at most numUniqueTags in result
			if len(result) > numUniqueTags {
				t.Errorf("More tags in result (%d) than unique tags created (%d)",
					len(result), numUniqueTags)
			}
		}()
	})
}

2 internal/sheets/testdata/fuzz/FuzzFilter/4316c263ab833860 vendored Normal file
@@ -0,0 +1,2 @@
go test fuzz v1
string("\xd7")

2 internal/sheets/testdata/fuzz/FuzzTags/28f36ef487f23e6c vendored Normal file
@@ -0,0 +1,2 @@
go test fuzz v1
string("\xf0")

19 vendor/github.com/alecthomas/chroma/Makefile generated vendored
@@ -1,19 +0,0 @@
.PHONY: chromad upload all

VERSION ?= $(shell git describe --tags --dirty --always)

all: README.md tokentype_string.go

README.md: lexers/*/*.go
	./table.py

tokentype_string.go: types.go
	go generate

chromad:
	rm -f chromad
	(export CGOENABLED=0 GOOS=linux GOARCH=amd64; cd ./cmd/chromad && go build -ldflags="-X 'main.version=$(VERSION)'" -o ../../chromad .)

upload: chromad
	scp chromad root@swapoff.org: && \
		ssh root@swapoff.org 'install -m755 ./chromad /srv/http/swapoff.org/bin && service chromad restart'

285 vendor/github.com/alecthomas/chroma/README.md generated vendored
@@ -1,285 +0,0 @@
# Chroma — A general purpose syntax highlighter in pure Go

[](https://godoc.org/github.com/alecthomas/chroma) [](https://github.com/alecthomas/chroma/actions/workflows/ci.yml) [](https://invite.slack.golangbridge.org/)

> **NOTE:** As Chroma has just been released, its API is still in flux. That said, the high-level interface should not change significantly.

Chroma takes source code and other structured text and converts it into syntax
highlighted HTML, ANSI-coloured text, etc.

Chroma is based heavily on [Pygments](http://pygments.org/), and includes
translators for Pygments lexers and styles.

<a id="markdown-table-of-contents" name="table-of-contents"></a>
## Table of Contents

<!-- TOC -->

1. [Table of Contents](#table-of-contents)
2. [Supported languages](#supported-languages)
3. [Try it](#try-it)
4. [Using the library](#using-the-library)
    1. [Quick start](#quick-start)
    2. [Identifying the language](#identifying-the-language)
    3. [Formatting the output](#formatting-the-output)
    4. [The HTML formatter](#the-html-formatter)
5. [More detail](#more-detail)
    1. [Lexers](#lexers)
    2. [Formatters](#formatters)
    3. [Styles](#styles)
6. [Command-line interface](#command-line-interface)
7. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)

<!-- /TOC -->

<a id="markdown-supported-languages" name="supported-languages"></a>
## Supported languages

Prefix | Language
:----: | --------
A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Arduino, Awk
B | Ballerina, Base Makefile, Bash, Batchfile, BibTeX, Bicep, BlitzBasic, BNF, Brainfuck
C | C, C#, C++, Caddyfile, Caddyfile Directives, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython
D | D, Dart, Diff, Django/Jinja, Docker, DTD, Dylan
E | EBNF, Elixir, Elm, EmacsLisp, Erlang
F | Factor, Fish, Forth, Fortran, FSharp
G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, Gherkin, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groff, Groovy
H | Handlebars, Haskell, Haxe, HCL, Hexdump, HLB, HTML, HTTP, Hy
I | Idris, Igor, INI, Io
J | J, Java, JavaScript, JSON, Julia, Jungle
K | Kotlin
L | Lighttpd configuration file, LLVM, Lua
M | Mako, markdown, Mason, Mathematica, Matlab, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL
N | NASM, Newspeak, Nginx configuration file, Nim, Nix
O | Objective-C, OCaml, Octave, OnesEnterprise, OpenEdge ABL, OpenSCAD, Org Mode
P | PacmanConf, Perl, PHP, PHTML, Pig, PkgConfig, PL/pgSQL, plaintext, Pony, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, PromQL, Protocol Buffer, Puppet, Python 2, Python
Q | QBasic
R | R, Racket, Ragel, Raku, react, ReasonML, reg, reStructuredText, Rexx, Ruby, Rust
S | SAS, Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, Snobol, Solidity, SPARQL, SQL, SquidConf, Standard ML, Stylus, Svelte, Swift, SYSTEMD, systemverilog
T | TableGen, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData
V | VB.net, verilog, VHDL, VimL, vue
W | WDTE
X | XML, Xorg
Y | YAML, YANG
Z | Zig

_I will attempt to keep this section up to date, but an authoritative list can be
displayed with `chroma --list`._

<a id="markdown-try-it" name="try-it"></a>
## Try it

Try out various languages and styles on the [Chroma Playground](https://swapoff.org/chroma/playground/).

<a id="markdown-using-the-library" name="using-the-library"></a>
## Using the library

Chroma, like Pygments, has the concepts of
[lexers](https://github.com/alecthomas/chroma/tree/master/lexers),
[formatters](https://github.com/alecthomas/chroma/tree/master/formatters) and
[styles](https://github.com/alecthomas/chroma/tree/master/styles).

Lexers convert source text into a stream of tokens, styles specify how token
types are mapped to colours, and formatters convert tokens and styles into
formatted output.

A package exists for each of these, containing a global `Registry` variable
with all of the registered implementations. There are also helper functions
for using the registry in each package, such as looking up lexers by name or
matching filenames, etc.

In all cases, if a lexer, formatter or style can not be determined, `nil` will
be returned. In this situation you may want to default to the `Fallback`
value in each respective package, which provides sane defaults.

<a id="markdown-quick-start" name="quick-start"></a>
### Quick start

A convenience function exists that can be used to simply format some source
text, without any effort:

```go
err := quick.Highlight(os.Stdout, someSourceCode, "go", "html", "monokai")
```

<a id="markdown-identifying-the-language" name="identifying-the-language"></a>
### Identifying the language

To highlight code, you'll first have to identify what language the code is
written in. There are three primary ways to do that:

1. Detect the language from its filename.

    ```go
    lexer := lexers.Match("foo.go")
    ```

2. Explicitly specify the language by its Chroma syntax ID (a full list is available from `lexers.Names()`).

    ```go
    lexer := lexers.Get("go")
    ```

3. Detect the language from its content.

    ```go
    lexer := lexers.Analyse("package main\n\nfunc main()\n{\n}\n")
    ```

In all cases, `nil` will be returned if the language can not be identified.

```go
if lexer == nil {
	lexer = lexers.Fallback
}
```

At this point, it should be noted that some lexers can be extremely chatty. To
mitigate this, you can use the coalescing lexer to coalesce runs of identical
token types into a single token:

```go
lexer = chroma.Coalesce(lexer)
```

<a id="markdown-formatting-the-output" name="formatting-the-output"></a>
### Formatting the output

Once a language is identified you will need to pick a formatter and a style (theme).

```go
style := styles.Get("swapoff")
if style == nil {
	style = styles.Fallback
}
formatter := formatters.Get("html")
if formatter == nil {
	formatter = formatters.Fallback
}
```

Then obtain an iterator over the tokens:

```go
contents, err := ioutil.ReadAll(r)
iterator, err := lexer.Tokenise(nil, string(contents))
```

And finally, format the tokens from the iterator:

```go
err := formatter.Format(w, style, iterator)
```

<a id="markdown-the-html-formatter" name="the-html-formatter"></a>
### The HTML formatter

By default the `html` registered formatter generates standalone HTML with
embedded CSS. More flexibility is available through the `formatters/html` package.

Firstly, the output generated by the formatter can be customised with the
following constructor options:

- `Standalone()` - generate standalone HTML with embedded CSS.
- `WithClasses()` - use classes rather than inlined style attributes.
- `ClassPrefix(prefix)` - prefix each generated CSS class.
- `TabWidth(width)` - Set the rendered tab width, in characters.
- `WithLineNumbers()` - Render line numbers (style with `LineNumbers`).
- `LinkableLineNumbers()` - Make the line numbers linkable and be a link to themselves.
- `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`).
- `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans.

If `WithClasses()` is used, the corresponding CSS can be obtained from the formatter with:

```go
formatter := html.New(html.WithClasses())
err := formatter.WriteCSS(w, style)
```

<a id="markdown-more-detail" name="more-detail"></a>
## More detail

<a id="markdown-lexers" name="lexers"></a>
### Lexers

See the [Pygments documentation](http://pygments.org/docs/lexerdevelopment/)
for details on implementing lexers. Most concepts apply directly to Chroma,
but see existing lexer implementations for real examples.

In many cases lexers can be automatically converted directly from Pygments by
using the included Python 3 script `pygments2chroma.py`. I use something like
the following:

```sh
python3 _tools/pygments2chroma.py \
    pygments.lexers.jvm.KotlinLexer \
    > lexers/k/kotlin.go \
    && gofmt -s -w lexers/k/kotlin.go
```

See notes in [pygments-lexers.txt](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt)
for a list of lexers, and notes on some of the issues importing them.

<a id="markdown-formatters" name="formatters"></a>
### Formatters

Chroma supports HTML output, as well as terminal output in 8 colour, 256 colour, and true-colour.

A `noop` formatter is included that outputs the token text only, and a `tokens`
formatter outputs raw tokens. The latter is useful for debugging lexers.

<a id="markdown-styles" name="styles"></a>
### Styles

Chroma styles use the [same syntax](http://pygments.org/docs/styles/) as Pygments.

All Pygments styles have been converted to Chroma using the `_tools/style.py` script.

When you work with one of [Chroma's styles](https://github.com/alecthomas/chroma/tree/master/styles), know that the `chroma.Background` token type provides the default style for tokens. It does so by defining a foreground color and background color.

For example, this gives each token name not defined in the style a default color of `#f8f8f8` and uses `#000000` for the highlighted code block's background:

~~~go
chroma.Background: "#f8f8f2 bg:#000000",
~~~

Also, token types in a style file are hierarchical. For instance, when `CommentSpecial` is not defined, Chroma uses the token style from `Comment`. So when several comment tokens use the same color, you'll only need to define `Comment` and override the one that has a different color.

For a quick overview of the available styles and how they look, check out the [Chroma Style Gallery](https://xyproto.github.io/splash/docs/).

<a id="markdown-command-line-interface" name="command-line-interface"></a>
## Command-line interface

A command-line interface to Chroma is included.

Binaries are available to install from [the releases page](https://github.com/alecthomas/chroma/releases).

The CLI can be used as a preprocessor to colorise output of `less(1)`,
see documentation for the `LESSOPEN` environment variable.

The `--fail` flag can be used to suppress output and return with exit status
1 to facilitate falling back to some other preprocessor in case chroma
does not resolve a specific lexer to use for the given file. For example:

```shell
export LESSOPEN='| p() { chroma --fail "$1" || cat "$1"; }; p "%s"'
```

Replace `cat` with your favourite fallback preprocessor.

When invoked as `.lessfilter`, the `--fail` flag is automatically turned
on under the hood for easy integration with [lesspipe shipping with
Debian and derivatives](https://manpages.debian.org/lesspipe#USER_DEFINED_FILTERS);
for that setup the `chroma` executable can be just symlinked to `~/.lessfilter`.

<a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a>
## What's missing compared to Pygments?

- Quite a few lexers, for various reasons (pull-requests welcome):
    - Pygments lexers for complex languages often include custom code to
      handle certain aspects, such as Raku's ability to nest code inside
      regular expressions. These require time and effort to convert.
    - I mostly only converted languages I had heard of, to reduce the porting cost.
- Some more esoteric features of Pygments are omitted for simplicity.
- Though the Chroma API supports content detection, very few languages support it.
  I have plans to implement a statistical analyser at some point, but not enough time.

60 vendor/github.com/alecthomas/chroma/lexers/a/abap.go generated vendored
@@ -1,60 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// ABAP lexer.
var Abap = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "ABAP",
		Aliases:         []string{"abap"},
		Filenames:       []string{"*.abap", "*.ABAP"},
		MimeTypes:       []string{"text/x-abap"},
		CaseInsensitive: true,
	},
	abapRules,
))

func abapRules() Rules {
	return Rules{
		"common": {
			{`\s+`, Text, nil},
			{`^\*.*$`, CommentSingle, nil},
			{`\".*?\n`, CommentSingle, nil},
			{`##\w+`, CommentSpecial, nil},
		},
		"variable-names": {
			{`<\S+>`, NameVariable, nil},
			{`\w[\w~]*(?:(\[\])|->\*)?`, NameVariable, nil},
		},
		"root": {
			Include("common"),
			{`CALL\s+(?:BADI|CUSTOMER-FUNCTION|FUNCTION)`, Keyword, nil},
			{`(CALL\s+(?:DIALOG|SCREEN|SUBSCREEN|SELECTION-SCREEN|TRANSACTION|TRANSFORMATION))\b`, Keyword, nil},
			{`(FORM|PERFORM)(\s+)(\w+)`, ByGroups(Keyword, Text, NameFunction), nil},
			{`(PERFORM)(\s+)(\()(\w+)(\))`, ByGroups(Keyword, Text, Punctuation, NameVariable, Punctuation), nil},
			{`(MODULE)(\s+)(\S+)(\s+)(INPUT|OUTPUT)`, ByGroups(Keyword, Text, NameFunction, Text, Keyword), nil},
			{`(METHOD)(\s+)([\w~]+)`, ByGroups(Keyword, Text, NameFunction), nil},
			{`(\s+)([\w\-]+)([=\-]>)([\w\-~]+)`, ByGroups(Text, NameVariable, Operator, NameFunction), nil},
			{`(?<=(=|-)>)([\w\-~]+)(?=\()`, NameFunction, nil},
			{`(TEXT)(-)(\d{3})`, ByGroups(Keyword, Punctuation, LiteralNumberInteger), nil},
			{`(TEXT)(-)(\w{3})`, ByGroups(Keyword, Punctuation, NameVariable), nil},
			{`(ADD-CORRESPONDING|AUTHORITY-CHECK|CLASS-DATA|CLASS-EVENTS|CLASS-METHODS|CLASS-POOL|DELETE-ADJACENT|DIVIDE-CORRESPONDING|EDITOR-CALL|ENHANCEMENT-POINT|ENHANCEMENT-SECTION|EXIT-COMMAND|FIELD-GROUPS|FIELD-SYMBOLS|FUNCTION-POOL|INTERFACE-POOL|INVERTED-DATE|LOAD-OF-PROGRAM|LOG-POINT|MESSAGE-ID|MOVE-CORRESPONDING|MULTIPLY-CORRESPONDING|NEW-LINE|NEW-PAGE|NEW-SECTION|NO-EXTENSION|OUTPUT-LENGTH|PRINT-CONTROL|SELECT-OPTIONS|START-OF-SELECTION|SUBTRACT-CORRESPONDING|SYNTAX-CHECK|SYSTEM-EXCEPTIONS|TYPE-POOL|TYPE-POOLS|NO-DISPLAY)\b`, Keyword, nil},
			{`(?<![-\>])(CREATE\s+(PUBLIC|PRIVATE|DATA|OBJECT)|(PUBLIC|PRIVATE|PROTECTED)\s+SECTION|(TYPE|LIKE)\s+((LINE\s+OF|REF\s+TO|(SORTED|STANDARD|HASHED)\s+TABLE\s+OF))?|FROM\s+(DATABASE|MEMORY)|CALL\s+METHOD|(GROUP|ORDER) BY|HAVING|SEPARATED BY|GET\s+(BADI|BIT|CURSOR|DATASET|LOCALE|PARAMETER|PF-STATUS|(PROPERTY|REFERENCE)\s+OF|RUN\s+TIME|TIME\s+(STAMP)?)?|SET\s+(BIT|BLANK\s+LINES|COUNTRY|CURSOR|DATASET|EXTENDED\s+CHECK|HANDLER|HOLD\s+DATA|LANGUAGE|LEFT\s+SCROLL-BOUNDARY|LOCALE|MARGIN|PARAMETER|PF-STATUS|PROPERTY\s+OF|RUN\s+TIME\s+(ANALYZER|CLOCK\s+RESOLUTION)|SCREEN|TITLEBAR|UPADTE\s+TASK\s+LOCAL|USER-COMMAND)|CONVERT\s+((INVERTED-)?DATE|TIME|TIME\s+STAMP|TEXT)|(CLOSE|OPEN)\s+(DATASET|CURSOR)|(TO|FROM)\s+(DATA BUFFER|INTERNAL TABLE|MEMORY ID|DATABASE|SHARED\s+(MEMORY|BUFFER))|DESCRIBE\s+(DISTANCE\s+BETWEEN|FIELD|LIST|TABLE)|FREE\s(MEMORY|OBJECT)?|PROCESS\s+(BEFORE\s+OUTPUT|AFTER\s+INPUT|ON\s+(VALUE-REQUEST|HELP-REQUEST))|AT\s+(LINE-SELECTION|USER-COMMAND|END\s+OF|NEW)|AT\s+SELECTION-SCREEN(\s+(ON(\s+(BLOCK|(HELP|VALUE)-REQUEST\s+FOR|END\s+OF|RADIOBUTTON\s+GROUP))?|OUTPUT))?|SELECTION-SCREEN:?\s+((BEGIN|END)\s+OF\s+((TABBED\s+)?BLOCK|LINE|SCREEN)|COMMENT|FUNCTION\s+KEY|INCLUDE\s+BLOCKS|POSITION|PUSHBUTTON|SKIP|ULINE)|LEAVE\s+(LIST-PROCESSING|PROGRAM|SCREEN|TO LIST-PROCESSING|TO TRANSACTION)(ENDING|STARTING)\s+AT|FORMAT\s+(COLOR|INTENSIFIED|INVERSE|HOTSPOT|INPUT|FRAMES|RESET)|AS\s+(CHECKBOX|SUBSCREEN|WINDOW)|WITH\s+(((NON-)?UNIQUE)?\s+KEY|FRAME)|(BEGIN|END)\s+OF|DELETE(\s+ADJACENT\s+DUPLICATES\sFROM)?|COMPARING(\s+ALL\s+FIELDS)?|(INSERT|APPEND)(\s+INITIAL\s+LINE\s+(IN)?TO|\s+LINES\s+OF)?|IN\s+((BYTE|CHARACTER)\s+MODE|PROGRAM)|END-OF-(DEFINITION|PAGE|SELECTION)|WITH\s+FRAME(\s+TITLE)|(REPLACE|FIND)\s+((FIRST|ALL)\s+OCCURRENCES?\s+OF\s+)?(SUBSTRING|REGEX)?|MATCH\s+(LENGTH|COUNT|LINE|OFFSET)|(RESPECTING|IGNORING)\s+CASE|IN\s+UPDATE\s+TASK|(SOURCE|RESULT)\s+(XML)?|REFERENCE\s+INTO|AND\s+(MARK|RETURN)|CLIENT\s+SPECIFIED|CORRESPONDING\s+FIELDS\s+OF|IF\s+FOUND|FOR\s+EVENT|INHERITING\s+FROM|LEAVE\s+TO\s+SCREEN|LOOP\s+AT\s+(SCREEN)?|LOWER\s+CASE|MATCHCODE\s+OBJECT|MODIF\s+ID|MODIFY\s+SCREEN|NESTING\s+LEVEL|NO\s+INTERVALS|OF\s+STRUCTURE|RADIOBUTTON\s+GROUP|RANGE\s+OF|REF\s+TO|SUPPRESS DIALOG|TABLE\s+OF|UPPER\s+CASE|TRANSPORTING\s+NO\s+FIELDS|VALUE\s+CHECK|VISIBLE\s+LENGTH|HEADER\s+LINE|COMMON\s+PART)\b`, Keyword, nil},
|
||||
{`(^|(?<=(\s|\.)))(ABBREVIATED|ABSTRACT|ADD|ALIASES|ALIGN|ALPHA|ASSERT|AS|ASSIGN(ING)?|AT(\s+FIRST)?|BACK|BLOCK|BREAK-POINT|CASE|CATCH|CHANGING|CHECK|CLASS|CLEAR|COLLECT|COLOR|COMMIT|CREATE|COMMUNICATION|COMPONENTS?|COMPUTE|CONCATENATE|CONDENSE|CONSTANTS|CONTEXTS|CONTINUE|CONTROLS|COUNTRY|CURRENCY|DATA|DATE|DECIMALS|DEFAULT|DEFINE|DEFINITION|DEFERRED|DEMAND|DETAIL|DIRECTORY|DIVIDE|DO|DUMMY|ELSE(IF)?|ENDAT|ENDCASE|ENDCATCH|ENDCLASS|ENDDO|ENDFORM|ENDFUNCTION|ENDIF|ENDINTERFACE|ENDLOOP|ENDMETHOD|ENDMODULE|ENDSELECT|ENDTRY|ENDWHILE|ENHANCEMENT|EVENTS|EXACT|EXCEPTIONS?|EXIT|EXPONENT|EXPORT|EXPORTING|EXTRACT|FETCH|FIELDS?|FOR|FORM|FORMAT|FREE|FROM|FUNCTION|HIDE|ID|IF|IMPORT|IMPLEMENTATION|IMPORTING|IN|INCLUDE|INCLUDING|INDEX|INFOTYPES|INITIALIZATION|INTERFACE|INTERFACES|INTO|LANGUAGE|LEAVE|LENGTH|LINES|LOAD|LOCAL|JOIN|KEY|NEXT|MAXIMUM|MESSAGE|METHOD[S]?|MINIMUM|MODULE|MODIFIER|MODIFY|MOVE|MULTIPLY|NODES|NUMBER|OBLIGATORY|OBJECT|OF|OFF|ON|OTHERS|OVERLAY|PACK|PAD|PARAMETERS|PERCENTAGE|POSITION|PROGRAM|PROVIDE|PUBLIC|PUT|PF\d\d|RAISE|RAISING|RANGES?|READ|RECEIVE|REDEFINITION|REFRESH|REJECT|REPORT|RESERVE|RESUME|RETRY|RETURN|RETURNING|RIGHT|ROLLBACK|REPLACE|SCROLL|SEARCH|SELECT|SHIFT|SIGN|SINGLE|SIZE|SKIP|SORT|SPLIT|STATICS|STOP|STYLE|SUBMATCHES|SUBMIT|SUBTRACT|SUM(?!\()|SUMMARY|SUMMING|SUPPLY|TABLE|TABLES|TIMESTAMP|TIMES?|TIMEZONE|TITLE|\??TO|TOP-OF-PAGE|TRANSFER|TRANSLATE|TRY|TYPES|ULINE|UNDER|UNPACK|UPDATE|USING|VALUE|VALUES|VIA|VARYING|VARY|WAIT|WHEN|WHERE|WIDTH|WHILE|WITH|WINDOW|WRITE|XSD|ZERO)\b`, Keyword, nil},
|
||||
{`(abs|acos|asin|atan|boolc|boolx|bit_set|char_off|charlen|ceil|cmax|cmin|condense|contains|contains_any_of|contains_any_not_of|concat_lines_of|cos|cosh|count|count_any_of|count_any_not_of|dbmaxlen|distance|escape|exp|find|find_end|find_any_of|find_any_not_of|floor|frac|from_mixed|insert|lines|log|log10|match|matches|nmax|nmin|numofchar|repeat|replace|rescale|reverse|round|segment|shift_left|shift_right|sign|sin|sinh|sqrt|strlen|substring|substring_after|substring_from|substring_before|substring_to|tan|tanh|to_upper|to_lower|to_mixed|translate|trunc|xstrlen)(\()\b`, ByGroups(NameBuiltin, Punctuation), nil},
|
||||
{`&[0-9]`, Name, nil},
|
||||
{`[0-9]+`, LiteralNumberInteger, nil},
|
||||
{`(?<=(\s|.))(AND|OR|EQ|NE|GT|LT|GE|LE|CO|CN|CA|NA|CS|NOT|NS|CP|NP|BYTE-CO|BYTE-CN|BYTE-CA|BYTE-NA|BYTE-CS|BYTE-NS|IS\s+(NOT\s+)?(INITIAL|ASSIGNED|REQUESTED|BOUND))\b`, OperatorWord, nil},
|
||||
Include("variable-names"),
|
||||
{`[?*<>=\-+&]`, Operator, nil},
|
||||
{`'(''|[^'])*'`, LiteralStringSingle, nil},
|
||||
{"`([^`])*`", LiteralStringSingle, nil},
|
||||
{`([|}])([^{}|]*?)([|{])`, ByGroups(Punctuation, LiteralStringSingle, Punctuation), nil},
|
||||
{`[/;:()\[\],.]`, Punctuation, nil},
|
||||
{`(!)(\w+)`, ByGroups(Operator, Name), nil},
|
||||
},
|
||||
}
|
||||
}
|
vendor/github.com/alecthomas/chroma/lexers/a/abnf.go (generated, vendored, 42 lines)
@@ -1,42 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Abnf lexer.
var Abnf = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "ABNF",
		Aliases:   []string{"abnf"},
		Filenames: []string{"*.abnf"},
		MimeTypes: []string{"text/x-abnf"},
	},
	abnfRules,
))

func abnfRules() Rules {
	return Rules{
		"root": {
			{`;.*$`, CommentSingle, nil},
			{`(%[si])?"[^"]*"`, Literal, nil},
			{`%b[01]+\-[01]+\b`, Literal, nil},
			{`%b[01]+(\.[01]+)*\b`, Literal, nil},
			{`%d[0-9]+\-[0-9]+\b`, Literal, nil},
			{`%d[0-9]+(\.[0-9]+)*\b`, Literal, nil},
			{`%x[0-9a-fA-F]+\-[0-9a-fA-F]+\b`, Literal, nil},
			{`%x[0-9a-fA-F]+(\.[0-9a-fA-F]+)*\b`, Literal, nil},
			{`\b[0-9]+\*[0-9]+`, Operator, nil},
			{`\b[0-9]+\*`, Operator, nil},
			{`\b[0-9]+`, Operator, nil},
			{`\*`, Operator, nil},
			{Words(``, `\b`, `ALPHA`, `BIT`, `CHAR`, `CR`, `CRLF`, `CTL`, `DIGIT`, `DQUOTE`, `HEXDIG`, `HTAB`, `LF`, `LWSP`, `OCTET`, `SP`, `VCHAR`, `WSP`), Keyword, nil},
			{`[a-zA-Z][a-zA-Z0-9-]+\b`, NameClass, nil},
			{`(=/|=|/)`, Operator, nil},
			{`[\[\]()]`, Punctuation, nil},
			{`\s+`, Text, nil},
			{`.`, Text, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/actionscript.go (generated, vendored, 43 lines)
@@ -1,43 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Actionscript lexer.
var Actionscript = internal.Register(MustNewLazyLexer(
	&Config{
		Name:         "ActionScript",
		Aliases:      []string{"as", "actionscript"},
		Filenames:    []string{"*.as"},
		MimeTypes:    []string{"application/x-actionscript", "text/x-actionscript", "text/actionscript"},
		NotMultiline: true,
		DotAll:       true,
	},
	actionscriptRules,
))

func actionscriptRules() Rules {
	return Rules{
		"root": {
			{`\s+`, Text, nil},
			{`//.*?\n`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
			{`/(\\\\|\\/|[^/\n])*/[gim]*`, LiteralStringRegex, nil},
			{`[~^*!%&<>|+=:;,/?\\-]+`, Operator, nil},
			{`[{}\[\]();.]+`, Punctuation, nil},
			{Words(``, `\b`, `case`, `default`, `for`, `each`, `in`, `while`, `do`, `break`, `return`, `continue`, `if`, `else`, `throw`, `try`, `catch`, `var`, `with`, `new`, `typeof`, `arguments`, `instanceof`, `this`, `switch`), Keyword, nil},
			{Words(``, `\b`, `class`, `public`, `final`, `internal`, `native`, `override`, `private`, `protected`, `static`, `import`, `extends`, `implements`, `interface`, `intrinsic`, `return`, `super`, `dynamic`, `function`, `const`, `get`, `namespace`, `package`, `set`), KeywordDeclaration, nil},
			{`(true|false|null|NaN|Infinity|-Infinity|undefined|Void)\b`, KeywordConstant, nil},
			{Words(``, `\b`, `Accessibility`, `AccessibilityProperties`, `ActionScriptVersion`, `ActivityEvent`, `AntiAliasType`, `ApplicationDomain`, `AsBroadcaster`, `Array`, `AsyncErrorEvent`, `AVM1Movie`, `BevelFilter`, `Bitmap`, `BitmapData`, `BitmapDataChannel`, `BitmapFilter`, `BitmapFilterQuality`, `BitmapFilterType`, `BlendMode`, `BlurFilter`, `Boolean`, `ByteArray`, `Camera`, `Capabilities`, `CapsStyle`, `Class`, `Color`, `ColorMatrixFilter`, `ColorTransform`, `ContextMenu`, `ContextMenuBuiltInItems`, `ContextMenuEvent`, `ContextMenuItem`, `ConvultionFilter`, `CSMSettings`, `DataEvent`, `Date`, `DefinitionError`, `DeleteObjectSample`, `Dictionary`, `DisplacmentMapFilter`, `DisplayObject`, `DisplacmentMapFilterMode`, `DisplayObjectContainer`, `DropShadowFilter`, `Endian`, `EOFError`, `Error`, `ErrorEvent`, `EvalError`, `Event`, `EventDispatcher`, `EventPhase`, `ExternalInterface`, `FileFilter`, `FileReference`, `FileReferenceList`, `FocusDirection`, `FocusEvent`, `Font`, `FontStyle`, `FontType`, `FrameLabel`, `FullScreenEvent`, `Function`, `GlowFilter`, `GradientBevelFilter`, `GradientGlowFilter`, `GradientType`, `Graphics`, `GridFitType`, `HTTPStatusEvent`, `IBitmapDrawable`, `ID3Info`, `IDataInput`, `IDataOutput`, `IDynamicPropertyOutputIDynamicPropertyWriter`, `IEventDispatcher`, `IExternalizable`, `IllegalOperationError`, `IME`, `IMEConversionMode`, `IMEEvent`, `int`, `InteractiveObject`, `InterpolationMethod`, `InvalidSWFError`, `InvokeEvent`, `IOError`, `IOErrorEvent`, `JointStyle`, `Key`, `Keyboard`, `KeyboardEvent`, `KeyLocation`, `LineScaleMode`, `Loader`, `LoaderContext`, `LoaderInfo`, `LoadVars`, `LocalConnection`, `Locale`, `Math`, `Matrix`, `MemoryError`, `Microphone`, `MorphShape`, `Mouse`, `MouseEvent`, `MovieClip`, `MovieClipLoader`, `Namespace`, `NetConnection`, `NetStatusEvent`, `NetStream`, `NewObjectSample`, `Number`, `Object`, `ObjectEncoding`, `PixelSnapping`, `Point`, `PrintJob`, `PrintJobOptions`, `PrintJobOrientation`, `ProgressEvent`, `Proxy`, `QName`, `RangeError`, `Rectangle`, `ReferenceError`, `RegExp`, `Responder`, `Sample`, `Scene`, `ScriptTimeoutError`, `Security`, `SecurityDomain`, `SecurityError`, `SecurityErrorEvent`, `SecurityPanel`, `Selection`, `Shape`, `SharedObject`, `SharedObjectFlushStatus`, `SimpleButton`, `Socket`, `Sound`, `SoundChannel`, `SoundLoaderContext`, `SoundMixer`, `SoundTransform`, `SpreadMethod`, `Sprite`, `StackFrame`, `StackOverflowError`, `Stage`, `StageAlign`, `StageDisplayState`, `StageQuality`, `StageScaleMode`, `StaticText`, `StatusEvent`, `String`, `StyleSheet`, `SWFVersion`, `SyncEvent`, `SyntaxError`, `System`, `TextColorType`, `TextField`, `TextFieldAutoSize`, `TextFieldType`, `TextFormat`, `TextFormatAlign`, `TextLineMetrics`, `TextRenderer`, `TextSnapshot`, `Timer`, `TimerEvent`, `Transform`, `TypeError`, `uint`, `URIError`, `URLLoader`, `URLLoaderDataFormat`, `URLRequest`, `URLRequestHeader`, `URLRequestMethod`, `URLStream`, `URLVariabeles`, `VerifyError`, `Video`, `XML`, `XMLDocument`, `XMLList`, `XMLNode`, `XMLNodeType`, `XMLSocket`, `XMLUI`), NameBuiltin, nil},
			{Words(``, `\b`, `decodeURI`, `decodeURIComponent`, `encodeURI`, `escape`, `eval`, `isFinite`, `isNaN`, `isXMLName`, `clearInterval`, `fscommand`, `getTimer`, `getURL`, `getVersion`, `parseFloat`, `parseInt`, `setInterval`, `trace`, `updateAfterEvent`, `unescape`), NameFunction, nil},
			{`[$a-zA-Z_]\w*`, NameOther, nil},
			{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
			{`0x[0-9a-f]+`, LiteralNumberHex, nil},
			{`[0-9]+`, LiteralNumberInteger, nil},
			{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
			{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/actionscript3.go (generated, vendored, 60 lines)
@@ -1,60 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Actionscript 3 lexer.
var Actionscript3 = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "ActionScript 3",
		Aliases:   []string{"as3", "actionscript3"},
		Filenames: []string{"*.as"},
		MimeTypes: []string{"application/x-actionscript3", "text/x-actionscript3", "text/actionscript3"},
		DotAll:    true,
	},
	actionscript3Rules,
))

func actionscript3Rules() Rules {
	return Rules{
		"root": {
			{`\s+`, Text, nil},
			{`(function\s+)([$a-zA-Z_]\w*)(\s*)(\()`, ByGroups(KeywordDeclaration, NameFunction, Text, Operator), Push("funcparams")},
			{`(var|const)(\s+)([$a-zA-Z_]\w*)(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?)`, ByGroups(KeywordDeclaration, Text, Name, Text, Punctuation, Text, KeywordType), nil},
			{`(import|package)(\s+)((?:[$a-zA-Z_]\w*|\.)+)(\s*)`, ByGroups(Keyword, Text, NameNamespace, Text), nil},
			{`(new)(\s+)([$a-zA-Z_]\w*(?:\.<\w+>)?)(\s*)(\()`, ByGroups(Keyword, Text, KeywordType, Text, Operator), nil},
			{`//.*?\n`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
			{`/(\\\\|\\/|[^\n])*/[gisx]*`, LiteralStringRegex, nil},
			{`(\.)([$a-zA-Z_]\w*)`, ByGroups(Operator, NameAttribute), nil},
			{`(case|default|for|each|in|while|do|break|return|continue|if|else|throw|try|catch|with|new|typeof|arguments|instanceof|this|switch|import|include|as|is)\b`, Keyword, nil},
			{`(class|public|final|internal|native|override|private|protected|static|import|extends|implements|interface|intrinsic|return|super|dynamic|function|const|get|namespace|package|set)\b`, KeywordDeclaration, nil},
			{`(true|false|null|NaN|Infinity|-Infinity|undefined|void)\b`, KeywordConstant, nil},
			{`(decodeURI|decodeURIComponent|encodeURI|escape|eval|isFinite|isNaN|isXMLName|clearInterval|fscommand|getTimer|getURL|getVersion|isFinite|parseFloat|parseInt|setInterval|trace|updateAfterEvent|unescape)\b`, NameFunction, nil},
			{`[$a-zA-Z_]\w*`, Name, nil},
			{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
			{`0x[0-9a-f]+`, LiteralNumberHex, nil},
			{`[0-9]+`, LiteralNumberInteger, nil},
			{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
			{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
			{`[~^*!%&<>|+=:;,/?\\{}\[\]().-]+`, Operator, nil},
		},
		"funcparams": {
			{`\s+`, Text, nil},
			{`(\s*)(\.\.\.)?([$a-zA-Z_]\w*)(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?|\*)(\s*)`, ByGroups(Text, Punctuation, Name, Text, Operator, Text, KeywordType, Text), Push("defval")},
			{`\)`, Operator, Push("type")},
		},
		"type": {
			{`(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?|\*)`, ByGroups(Text, Operator, Text, KeywordType), Pop(2)},
			{`\s+`, Text, Pop(2)},
			Default(Pop(2)),
		},
		"defval": {
			{`(=)(\s*)([^(),]+)(\s*)(,?)`, ByGroups(Operator, Text, UsingSelf("root"), Text, Operator), Pop(1)},
			{`,`, Operator, Pop(1)},
			Default(Pop(1)),
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/ada.go (generated, vendored, 118 lines)
@@ -1,118 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Ada lexer.
var Ada = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "Ada",
		Aliases:         []string{"ada", "ada95", "ada2005"},
		Filenames:       []string{"*.adb", "*.ads", "*.ada"},
		MimeTypes:       []string{"text/x-ada"},
		CaseInsensitive: true,
	},
	adaRules,
))

func adaRules() Rules {
	return Rules{
		"root": {
			{`[^\S\n]+`, Text, nil},
			{`--.*?\n`, CommentSingle, nil},
			{`[^\S\n]+`, Text, nil},
			{`function|procedure|entry`, KeywordDeclaration, Push("subprogram")},
			{`(subtype|type)(\s+)(\w+)`, ByGroups(KeywordDeclaration, Text, KeywordType), Push("type_def")},
			{`task|protected`, KeywordDeclaration, nil},
			{`(subtype)(\s+)`, ByGroups(KeywordDeclaration, Text), nil},
			{`(end)(\s+)`, ByGroups(KeywordReserved, Text), Push("end")},
			{`(pragma)(\s+)(\w+)`, ByGroups(KeywordReserved, Text, CommentPreproc), nil},
			{`(true|false|null)\b`, KeywordConstant, nil},
			{Words(``, `\b`, `Address`, `Byte`, `Boolean`, `Character`, `Controlled`, `Count`, `Cursor`, `Duration`, `File_Mode`, `File_Type`, `Float`, `Generator`, `Integer`, `Long_Float`, `Long_Integer`, `Long_Long_Float`, `Long_Long_Integer`, `Natural`, `Positive`, `Reference_Type`, `Short_Float`, `Short_Integer`, `Short_Short_Float`, `Short_Short_Integer`, `String`, `Wide_Character`, `Wide_String`), KeywordType, nil},
			{`(and(\s+then)?|in|mod|not|or(\s+else)|rem)\b`, OperatorWord, nil},
			{`generic|private`, KeywordDeclaration, nil},
			{`package`, KeywordDeclaration, Push("package")},
			{`array\b`, KeywordReserved, Push("array_def")},
			{`(with|use)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
			{`(\w+)(\s*)(:)(\s*)(constant)`, ByGroups(NameConstant, Text, Punctuation, Text, KeywordReserved), nil},
			{`<<\w+>>`, NameLabel, nil},
			{`(\w+)(\s*)(:)(\s*)(declare|begin|loop|for|while)`, ByGroups(NameLabel, Text, Punctuation, Text, KeywordReserved), nil},
			{Words(`\b`, `\b`, `abort`, `abs`, `abstract`, `accept`, `access`, `aliased`, `all`, `array`, `at`, `begin`, `body`, `case`, `constant`, `declare`, `delay`, `delta`, `digits`, `do`, `else`, `elsif`, `end`, `entry`, `exception`, `exit`, `interface`, `for`, `goto`, `if`, `is`, `limited`, `loop`, `new`, `null`, `of`, `or`, `others`, `out`, `overriding`, `pragma`, `protected`, `raise`, `range`, `record`, `renames`, `requeue`, `return`, `reverse`, `select`, `separate`, `subtype`, `synchronized`, `task`, `tagged`, `terminate`, `then`, `type`, `until`, `when`, `while`, `xor`), KeywordReserved, nil},
			{`"[^"]*"`, LiteralString, nil},
			Include("attribute"),
			Include("numbers"),
			{`'[^']'`, LiteralStringChar, nil},
			{`(\w+)(\s*|[(,])`, ByGroups(Name, UsingSelf("root")), nil},
			{`(<>|=>|:=|[()|:;,.'])`, Punctuation, nil},
			{`[*<>+=/&-]`, Operator, nil},
			{`\n+`, Text, nil},
		},
		"numbers": {
			{`[0-9_]+#[0-9a-f]+#`, LiteralNumberHex, nil},
			{`[0-9_]+\.[0-9_]*`, LiteralNumberFloat, nil},
			{`[0-9_]+`, LiteralNumberInteger, nil},
		},
		"attribute": {
			{`(')(\w+)`, ByGroups(Punctuation, NameAttribute), nil},
		},
		"subprogram": {
			{`\(`, Punctuation, Push("#pop", "formal_part")},
			{`;`, Punctuation, Pop(1)},
			{`is\b`, KeywordReserved, Pop(1)},
			{`"[^"]+"|\w+`, NameFunction, nil},
			Include("root"),
		},
		"end": {
			{`(if|case|record|loop|select)`, KeywordReserved, nil},
			{`"[^"]+"|[\w.]+`, NameFunction, nil},
			{`\s+`, Text, nil},
			{`;`, Punctuation, Pop(1)},
		},
		"type_def": {
			{`;`, Punctuation, Pop(1)},
			{`\(`, Punctuation, Push("formal_part")},
			{`with|and|use`, KeywordReserved, nil},
			{`array\b`, KeywordReserved, Push("#pop", "array_def")},
			{`record\b`, KeywordReserved, Push("record_def")},
			{`(null record)(;)`, ByGroups(KeywordReserved, Punctuation), Pop(1)},
			Include("root"),
		},
		"array_def": {
			{`;`, Punctuation, Pop(1)},
			{`(\w+)(\s+)(range)`, ByGroups(KeywordType, Text, KeywordReserved), nil},
			Include("root"),
		},
		"record_def": {
			{`end record`, KeywordReserved, Pop(1)},
			Include("root"),
		},
		"import": {
			{`[\w.]+`, NameNamespace, Pop(1)},
			Default(Pop(1)),
		},
		"formal_part": {
			{`\)`, Punctuation, Pop(1)},
			{`\w+`, NameVariable, nil},
			{`,|:[^=]`, Punctuation, nil},
			{`(in|not|null|out|access)\b`, KeywordReserved, nil},
			Include("root"),
		},
		"package": {
			{`body`, KeywordDeclaration, nil},
			{`is\s+new|renames`, KeywordReserved, nil},
			{`is`, KeywordReserved, Pop(1)},
			{`;`, Punctuation, Pop(1)},
			{`\(`, Punctuation, Push("package_instantiation")},
			{`([\w.]+)`, NameClass, nil},
			Include("root"),
		},
		"package_instantiation": {
			{`("[^"]+"|\w+)(\s+)(=>)`, ByGroups(NameVariable, Text, Punctuation), nil},
			{`[\w.\'"]`, Text, nil},
			{`\)`, Punctuation, Pop(1)},
			Include("root"),
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/al.go (generated, vendored, 47 lines)
@@ -1,47 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Al lexer.
var Al = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "AL",
		Aliases:         []string{"al"},
		Filenames:       []string{"*.al", "*.dal"},
		MimeTypes:       []string{"text/x-al"},
		DotAll:          true,
		CaseInsensitive: true,
	},
	alRules,
))

// https://github.com/microsoft/AL/blob/master/grammar/alsyntax.tmlanguage
func alRules() Rules {
	return Rules{
		"root": {
			{`\s+`, TextWhitespace, nil},
			{`(?s)\/\*.*?\\*\*\/`, CommentMultiline, nil},
			{`(?s)//.*?\n`, CommentSingle, nil},
			{`\"([^\"])*\"`, Text, nil},
			{`'([^'])*'`, LiteralString, nil},
			{`\b(?i:(ARRAY|ASSERTERROR|BEGIN|BREAK|CASE|DO|DOWNTO|ELSE|END|EVENT|EXIT|FOR|FOREACH|FUNCTION|IF|IMPLEMENTS|IN|INDATASET|INTERFACE|INTERNAL|LOCAL|OF|PROCEDURE|PROGRAM|PROTECTED|REPEAT|RUNONCLIENT|SECURITYFILTERING|SUPPRESSDISPOSE|TEMPORARY|THEN|TO|TRIGGER|UNTIL|VAR|WHILE|WITH|WITHEVENTS))\b`, Keyword, nil},
			{`\b(?i:(AND|DIV|MOD|NOT|OR|XOR))\b`, OperatorWord, nil},
			{`\b(?i:(AVERAGE|CONST|COUNT|EXIST|FIELD|FILTER|LOOKUP|MAX|MIN|ORDER|SORTING|SUM|TABLEDATA|UPPERLIMIT|WHERE|ASCENDING|DESCENDING))\b`, Keyword, nil},
			{`\b(?i:(CODEUNIT|PAGE|PAGEEXTENSION|PAGECUSTOMIZATION|DOTNET|ENUM|ENUMEXTENSION|VALUE|QUERY|REPORT|TABLE|TABLEEXTENSION|XMLPORT|PROFILE|CONTROLADDIN|REPORTEXTENSION|INTERFACE|PERMISSIONSET|PERMISSIONSETEXTENSION|ENTITLEMENT))\b`, Keyword, nil},
			{`\b(?i:(Action|Array|Automation|BigInteger|BigText|Blob|Boolean|Byte|Char|ClientType|Code|Codeunit|CompletionTriggerErrorLevel|ConnectionType|Database|DataClassification|DataScope|Date|DateFormula|DateTime|Decimal|DefaultLayout|Dialog|Dictionary|DotNet|DotNetAssembly|DotNetTypeDeclaration|Duration|Enum|ErrorInfo|ErrorType|ExecutionContext|ExecutionMode|FieldClass|FieldRef|FieldType|File|FilterPageBuilder|Guid|InStream|Integer|Joker|KeyRef|List|ModuleDependencyInfo|ModuleInfo|None|Notification|NotificationScope|ObjectType|Option|OutStream|Page|PageResult|Query|Record|RecordId|RecordRef|Report|ReportFormat|SecurityFilter|SecurityFiltering|Table|TableConnectionType|TableFilter|TestAction|TestField|TestFilterField|TestPage|TestPermissions|TestRequestPage|Text|TextBuilder|TextConst|TextEncoding|Time|TransactionModel|TransactionType|Variant|Verbosity|Version|XmlPort|HttpContent|HttpHeaders|HttpClient|HttpRequestMessage|HttpResponseMessage|JsonToken|JsonValue|JsonArray|JsonObject|View|Views|XmlAttribute|XmlAttributeCollection|XmlComment|XmlCData|XmlDeclaration|XmlDocument|XmlDocumentType|XmlElement|XmlNamespaceManager|XmlNameTable|XmlNode|XmlNodeList|XmlProcessingInstruction|XmlReadOptions|XmlText|XmlWriteOptions|WebServiceActionContext|WebServiceActionResultCode|SessionSettings))\b`, Keyword, nil},
			{`\b([<>]=|<>|<|>)\b?`, Operator, nil},
			{`\b(\-|\+|\/|\*)\b`, Operator, nil},
			{`\s*(\:=|\+=|-=|\/=|\*=)\s*?`, Operator, nil},
			{`\b(?i:(ADD|ADDFIRST|ADDLAST|ADDAFTER|ADDBEFORE|ACTION|ACTIONS|AREA|ASSEMBLY|CHARTPART|CUEGROUP|CUSTOMIZES|COLUMN|DATAITEM|DATASET|ELEMENTS|EXTENDS|FIELD|FIELDGROUP|FIELDATTRIBUTE|FIELDELEMENT|FIELDGROUPS|FIELDS|FILTER|FIXED|GRID|GROUP|MOVEAFTER|MOVEBEFORE|KEY|KEYS|LABEL|LABELS|LAYOUT|MODIFY|MOVEFIRST|MOVELAST|MOVEBEFORE|MOVEAFTER|PART|REPEATER|USERCONTROL|REQUESTPAGE|SCHEMA|SEPARATOR|SYSTEMPART|TABLEELEMENT|TEXTATTRIBUTE|TEXTELEMENT|TYPE))\b`, Keyword, nil},
			{`\s*[(\.\.)&\|]\s*`, Operator, nil},
			{`\b((0(x|X)[0-9a-fA-F]*)|(([0-9]+\.?[0-9]*)|(\.[0-9]+))((e|E)(\+|-)?[0-9]+)?)(L|l|UL|ul|u|U|F|f|ll|LL|ull|ULL)?\b`, LiteralNumber, nil},
			{`[;:,]`, Punctuation, nil},
			{`#[ \t]*(if|else|elif|endif|define|undef|region|endregion|pragma)\b.*?\n`, CommentPreproc, nil},
			{`\w+`, Text, nil},
			{`.`, Text, nil},
		},
	}
}
46
vendor/github.com/alecthomas/chroma/lexers/a/angular2.go
generated
vendored
46
vendor/github.com/alecthomas/chroma/lexers/a/angular2.go
generated
vendored
@@ -1,46 +0,0 @@
|
||||
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Angular2 lexer.
var Angular2 = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Angular2",
		Aliases:   []string{"ng2"},
		Filenames: []string{},
		MimeTypes: []string{},
	},
	angular2Rules,
))

func angular2Rules() Rules {
	return Rules{
		"root": {
			{`[^{([*#]+`, Other, nil},
			{`(\{\{)(\s*)`, ByGroups(CommentPreproc, Text), Push("ngExpression")},
			{`([([]+)([\w:.-]+)([\])]+)(\s*)(=)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Text, Operator, Text), Push("attr")},
			{`([([]+)([\w:.-]+)([\])]+)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Text), nil},
			{`([*#])([\w:.-]+)(\s*)(=)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Operator), Push("attr")},
			{`([*#])([\w:.-]+)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation), nil},
		},
		"ngExpression": {
			{`\s+(\|\s+)?`, Text, nil},
			{`\}\}`, CommentPreproc, Pop(1)},
			{`:?(true|false)`, LiteralStringBoolean, nil},
			{`:?"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
			{`:?'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
			{`[0-9](\.[0-9]*)?(eE[+-][0-9])?[flFLdD]?|0[xX][0-9a-fA-F]+[Ll]?`, LiteralNumber, nil},
			{`[a-zA-Z][\w-]*(\(.*\))?`, NameVariable, nil},
			{`\.[\w-]+(\(.*\))?`, NameVariable, nil},
			{`(\?)(\s*)([^}\s]+)(\s*)(:)(\s*)([^}\s]+)(\s*)`, ByGroups(Operator, Text, LiteralString, Text, Operator, Text, LiteralString, Text), nil},
		},
		"attr": {
			{`".*?"`, LiteralString, Pop(1)},
			{`'.*?'`, LiteralString, Pop(1)},
			{`[^\s>]+`, LiteralString, Pop(1)},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/antlr.go (generated, vendored, 105 lines)
@@ -1,105 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// ANTLR lexer.
var ANTLR = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "ANTLR",
		Aliases:   []string{"antlr"},
		Filenames: []string{},
		MimeTypes: []string{},
	},
	antlrRules,
))

func antlrRules() Rules {
	return Rules{
		"whitespace": {
			{`\s+`, TextWhitespace, nil},
		},
		"comments": {
			{`//.*$`, Comment, nil},
			{`/\*(.|\n)*?\*/`, Comment, nil},
		},
		"root": {
			Include("whitespace"),
			Include("comments"),
			{`(lexer|parser|tree)?(\s*)(grammar\b)(\s*)([A-Za-z]\w*)(;)`, ByGroups(Keyword, TextWhitespace, Keyword, TextWhitespace, NameClass, Punctuation), nil},
			{`options\b`, Keyword, Push("options")},
			{`tokens\b`, Keyword, Push("tokens")},
			{`(scope)(\s*)([A-Za-z]\w*)(\s*)(\{)`, ByGroups(Keyword, TextWhitespace, NameVariable, TextWhitespace, Punctuation), Push("action")},
			{`(catch|finally)\b`, Keyword, Push("exception")},
			{`(@[A-Za-z]\w*)(\s*)(::)?(\s*)([A-Za-z]\w*)(\s*)(\{)`, ByGroups(NameLabel, TextWhitespace, Punctuation, TextWhitespace, NameLabel, TextWhitespace, Punctuation), Push("action")},
			{`((?:protected|private|public|fragment)\b)?(\s*)([A-Za-z]\w*)(!)?`, ByGroups(Keyword, TextWhitespace, NameLabel, Punctuation), Push("rule-alts", "rule-prelims")},
		},
		"exception": {
			{`\n`, TextWhitespace, Pop(1)},
			{`\s`, TextWhitespace, nil},
			Include("comments"),
			{`\[`, Punctuation, Push("nested-arg-action")},
			{`\{`, Punctuation, Push("action")},
		},
		"rule-prelims": {
			Include("whitespace"),
			Include("comments"),
			{`returns\b`, Keyword, nil},
			{`\[`, Punctuation, Push("nested-arg-action")},
			{`\{`, Punctuation, Push("action")},
			{`(throws)(\s+)([A-Za-z]\w*)`, ByGroups(Keyword, TextWhitespace, NameLabel), nil},
			{`(,)(\s*)([A-Za-z]\w*)`, ByGroups(Punctuation, TextWhitespace, NameLabel), nil},
			{`options\b`, Keyword, Push("options")},
			{`(scope)(\s+)(\{)`, ByGroups(Keyword, TextWhitespace, Punctuation), Push("action")},
			{`(scope)(\s+)([A-Za-z]\w*)(\s*)(;)`, ByGroups(Keyword, TextWhitespace, NameLabel, TextWhitespace, Punctuation), nil},
			{`(@[A-Za-z]\w*)(\s*)(\{)`, ByGroups(NameLabel, TextWhitespace, Punctuation), Push("action")},
			{`:`, Punctuation, Pop(1)},
		},
		"rule-alts": {
			Include("whitespace"),
			Include("comments"),
			{`options\b`, Keyword, Push("options")},
			{`:`, Punctuation, nil},
			{`'(\\\\|\\'|[^'])*'`, LiteralString, nil},
			{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
			{`<<([^>]|>[^>])>>`, LiteralString, nil},
			{`\$?[A-Z_]\w*`, NameConstant, nil},
			{`\$?[a-z_]\w*`, NameVariable, nil},
			{`(\+|\||->|=>|=|\(|\)|\.\.|\.|\?|\*|\^|!|\#|~)`, Operator, nil},
			{`,`, Punctuation, nil},
			{`\[`, Punctuation, Push("nested-arg-action")},
			{`\{`, Punctuation, Push("action")},
			{`;`, Punctuation, Pop(1)},
		},
		"tokens": {
			Include("whitespace"),
			Include("comments"),
			{`\{`, Punctuation, nil},
			{`([A-Z]\w*)(\s*)(=)?(\s*)(\'(?:\\\\|\\\'|[^\']*)\')?(\s*)(;)`, ByGroups(NameLabel, TextWhitespace, Punctuation, TextWhitespace, LiteralString, TextWhitespace, Punctuation), nil},
			{`\}`, Punctuation, Pop(1)},
		},
		"options": {
			Include("whitespace"),
			Include("comments"),
			{`\{`, Punctuation, nil},
			{`([A-Za-z]\w*)(\s*)(=)(\s*)([A-Za-z]\w*|\'(?:\\\\|\\\'|[^\']*)\'|[0-9]+|\*)(\s*)(;)`, ByGroups(NameVariable, TextWhitespace, Punctuation, TextWhitespace, Text, TextWhitespace, Punctuation), nil},
			{`\}`, Punctuation, Pop(1)},
		},
		"action": {
			{`([^${}\'"/\\]+|"(\\\\|\\"|[^"])*"|'(\\\\|\\'|[^'])*'|//.*$\n?|/\*(.|\n)*?\*/|/(?!\*)(\\\\|\\/|[^/])*/|\\(?!%)|/)+`, Other, nil},
			{`(\\)(%)`, ByGroups(Punctuation, Other), nil},
			{`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil},
			{`\{`, Punctuation, Push()},
			{`\}`, Punctuation, Pop(1)},
		},
		"nested-arg-action": {
			{`([^$\[\]\'"/]+|"(\\\\|\\"|[^"])*"|'(\\\\|\\'|[^'])*'|//.*$\n?|/\*(.|\n)*?\*/|/(?!\*)(\\\\|\\/|[^/])*/|/)+`, Other, nil},
			{`\[`, Punctuation, Push()},
			{`\]`, Punctuation, Pop(1)},
			{`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil},
			{`(\\\\|\\\]|\\\[|[^\[\]])+`, Other, nil},
		},
	}
}
||||
vendor/github.com/alecthomas/chroma/lexers/a/apache.go (42 lines, generated, vendored)
@@ -1,42 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Apacheconf lexer.
var Apacheconf = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "ApacheConf",
		Aliases:         []string{"apacheconf", "aconf", "apache"},
		Filenames:       []string{".htaccess", "apache.conf", "apache2.conf"},
		MimeTypes:       []string{"text/x-apacheconf"},
		CaseInsensitive: true,
	},
	apacheconfRules,
))

func apacheconfRules() Rules {
	return Rules{
		"root": {
			{`\s+`, Text, nil},
			{`(#.*?)$`, Comment, nil},
			{`(<[^\s>]+)(?:(\s+)(.*?))?(>)`, ByGroups(NameTag, Text, LiteralString, NameTag), nil},
			{`([a-z]\w*)(\s+)`, ByGroups(NameBuiltin, Text), Push("value")},
			{`\.+`, Text, nil},
		},
		"value": {
			{`\\\n`, Text, nil},
			{`$`, Text, Pop(1)},
			{`\\`, Text, nil},
			{`[^\S\n]+`, Text, nil},
			{`\d+\.\d+\.\d+\.\d+(?:/\d+)?`, LiteralNumber, nil},
			{`\d+`, LiteralNumber, nil},
			{`/([a-z0-9][\w./-]+)`, LiteralStringOther, nil},
			{`(on|off|none|any|all|double|email|dns|min|minimal|os|productonly|full|emerg|alert|crit|error|warn|notice|info|debug|registry|script|inetd|standalone|user|group)\b`, Keyword, nil},
			{`"([^"\\]*(?:\\.[^"\\]*)*)"`, LiteralStringDouble, nil},
			{`[^\s"\\]+`, Text, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/apl.go (40 lines, generated, vendored)
@@ -1,40 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Apl lexer.
var Apl = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "APL",
		Aliases:   []string{"apl"},
		Filenames: []string{"*.apl"},
		MimeTypes: []string{},
	},
	aplRules,
))

func aplRules() Rules {
	return Rules{
		"root": {
			{`\s+`, Text, nil},
			{`[⍝#].*$`, CommentSingle, nil},
			{`\'((\'\')|[^\'])*\'`, LiteralStringSingle, nil},
			{`"(("")|[^"])*"`, LiteralStringDouble, nil},
			{`[⋄◇()]`, Punctuation, nil},
			{`[\[\];]`, LiteralStringRegex, nil},
			{`⎕[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameFunction, nil},
			{`[A-Za-zΔ∆⍙_][A-Za-zΔ∆⍙_¯0-9]*`, NameVariable, nil},
			{`¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞)([Jj]¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞))?`, LiteralNumber, nil},
			{`[\.\\/⌿⍀¨⍣⍨⍠⍤∘⍥@⌺⌶⍢]`, NameAttribute, nil},
			{`[+\-×÷⌈⌊∣|⍳?*⍟○!⌹<≤=>≥≠≡≢∊⍷∪∩~∨∧⍱⍲⍴,⍪⌽⊖⍉↑↓⊂⊃⌷⍋⍒⊤⊥⍕⍎⊣⊢⍁⍂≈⌸⍯↗⊆⍸]`, Operator, nil},
			{`⍬`, NameConstant, nil},
			{`[⎕⍞]`, NameVariableGlobal, nil},
			{`[←→]`, KeywordDeclaration, nil},
			{`[⍺⍵⍶⍹∇:]`, NameBuiltinPseudo, nil},
			{`[{}]`, KeywordType, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/applescript.go (59 lines, generated, vendored)
@@ -1,59 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Applescript lexer.
var Applescript = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "AppleScript",
		Aliases:   []string{"applescript"},
		Filenames: []string{"*.applescript"},
		MimeTypes: []string{},
		DotAll:    true,
	},
	applescriptRules,
))

func applescriptRules() Rules {
	return Rules{
		"root": {
			{`\s+`, Text, nil},
			{`¬\n`, LiteralStringEscape, nil},
			{`'s\s+`, Text, nil},
			{`(--|#).*?$`, Comment, nil},
			{`\(\*`, CommentMultiline, Push("comment")},
			{`[(){}!,.:]`, Punctuation, nil},
			{`(«)([^»]+)(»)`, ByGroups(Text, NameBuiltin, Text), nil},
			{`\b((?:considering|ignoring)\s*)(application responses|case|diacriticals|hyphens|numeric strings|punctuation|white space)`, ByGroups(Keyword, NameBuiltin), nil},
			{`(-|\*|\+|&|≠|>=?|<=?|=|≥|≤|/|÷|\^)`, Operator, nil},
			{`\b(and|or|is equal|equals|(is )?equal to|is not|isn't|isn't equal( to)?|is not equal( to)?|doesn't equal|does not equal|(is )?greater than|comes after|is not less than or equal( to)?|isn't less than or equal( to)?|(is )?less than|comes before|is not greater than or equal( to)?|isn't greater than or equal( to)?|(is )?greater than or equal( to)?|is not less than|isn't less than|does not come before|doesn't come before|(is )?less than or equal( to)?|is not greater than|isn't greater than|does not come after|doesn't come after|starts? with|begins? with|ends? with|contains?|does not contain|doesn't contain|is in|is contained by|is not in|is not contained by|isn't contained by|div|mod|not|(a )?(ref( to)?|reference to)|is|does)\b`, OperatorWord, nil},
			{`^(\s*(?:on|end)\s+)(zoomed|write to file|will zoom|will show|will select tab view item|will resize( sub views)?|will resign active|will quit|will pop up|will open|will move|will miniaturize|will hide|will finish launching|will display outline cell|will display item cell|will display cell|will display browser cell|will dismiss|will close|will become active|was miniaturized|was hidden|update toolbar item|update parameters|update menu item|shown|should zoom|should selection change|should select tab view item|should select row|should select item|should select column|should quit( after last window closed)?|should open( untitled)?|should expand item|should end editing|should collapse item|should close|should begin editing|selection changing|selection changed|selected tab view item|scroll wheel|rows changed|right mouse up|right mouse dragged|right mouse down|resized( sub views)?|resigned main|resigned key|resigned active|read from file|prepare table drop|prepare table drag|prepare outline drop|prepare outline drag|prepare drop|plugin loaded|parameters updated|panel ended|opened|open untitled|number of rows|number of items|number of browser rows|moved|mouse up|mouse moved|mouse exited|mouse entered|mouse dragged|mouse down|miniaturized|load data representation|launched|keyboard up|keyboard down|items changed|item value changed|item value|item expandable|idle|exposed|end editing|drop|drag( (entered|exited|updated))?|double clicked|document nib name|dialog ended|deminiaturized|data representation|conclude drop|column resized|column moved|column clicked|closed|clicked toolbar item|clicked|choose menu item|child of item|changed|change item value|change cell value|cell value changed|cell value|bounds changed|begin editing|became main|became key|awake from nib|alert ended|activated|action|accept table drop|accept outline drop)`, ByGroups(Keyword, NameFunction), nil},
			{`^(\s*)(in|on|script|to)(\s+)`, ByGroups(Text, Keyword, Text), nil},
			{`\b(as )(alias |application |boolean |class |constant |date |file |integer |list |number |POSIX file |real |record |reference |RGB color |script |text |unit types|(?:Unicode )?text|string)\b`, ByGroups(Keyword, NameClass), nil},
			{`\b(AppleScript|current application|false|linefeed|missing value|pi|quote|result|return|space|tab|text item delimiters|true|version)\b`, NameConstant, nil},
			{`\b(ASCII (character|number)|activate|beep|choose URL|choose application|choose color|choose file( name)?|choose folder|choose from list|choose remote application|clipboard info|close( access)?|copy|count|current date|delay|delete|display (alert|dialog)|do shell script|duplicate|exists|get eof|get volume settings|info for|launch|list (disks|folder)|load script|log|make|mount volume|new|offset|open( (for access|location))?|path to|print|quit|random number|read|round|run( script)?|say|scripting components|set (eof|the clipboard to|volume)|store script|summarize|system attribute|system info|the clipboard|time to GMT|write|quoted form)\b`, NameBuiltin, nil},
			{`\b(considering|else|error|exit|from|if|ignoring|in|repeat|tell|then|times|to|try|until|using terms from|while|with|with timeout( of)?|with transaction|by|continue|end|its?|me|my|return|of|as)\b`, Keyword, nil},
			{`\b(global|local|prop(erty)?|set|get)\b`, Keyword, nil},
			{`\b(but|put|returning|the)\b`, NameBuiltin, nil},
			{`\b(attachment|attribute run|character|day|month|paragraph|word|year)s?\b`, NameBuiltin, nil},
			{`\b(about|above|against|apart from|around|aside from|at|below|beneath|beside|between|for|given|instead of|on|onto|out of|over|since)\b`, NameBuiltin, nil},
			{`\b(accepts arrow key|action method|active|alignment|allowed identifiers|allows branch selection|allows column reordering|allows column resizing|allows column selection|allows customization|allows editing text attributes|allows empty selection|allows mixed state|allows multiple selection|allows reordering|allows undo|alpha( value)?|alternate image|alternate increment value|alternate title|animation delay|associated file name|associated object|auto completes|auto display|auto enables items|auto repeat|auto resizes( outline column)?|auto save expanded items|auto save name|auto save table columns|auto saves configuration|auto scroll|auto sizes all columns to fit|auto sizes cells|background color|bezel state|bezel style|bezeled|border rect|border type|bordered|bounds( rotation)?|box type|button returned|button type|can choose directories|can choose files|can draw|can hide|cell( (background color|size|type))?|characters|class|click count|clicked( data)? column|clicked data item|clicked( data)? row|closeable|collating|color( (mode|panel))|command key down|configuration|content(s| (size|view( margins)?))?|context|continuous|control key down|control size|control tint|control view|controller visible|coordinate system|copies( on scroll)?|corner view|current cell|current column|current( field)? editor|current( menu)? item|current row|current tab view item|data source|default identifiers|delta (x|y|z)|destination window|directory|display mode|displayed cell|document( (edited|rect|view))?|double value|dragged column|dragged distance|dragged items|draws( cell)? background|draws grid|dynamically scrolls|echos bullets|edge|editable|edited( data)? column|edited data item|edited( data)? row|enabled|enclosing scroll view|ending page|error handling|event number|event type|excluded from windows menu|executable path|expanded|fax number|field editor|file kind|file name|file type|first responder|first visible column|flipped|floating|font( panel)?|formatter|frameworks path|frontmost|gave up|grid color|has data items|has horizontal ruler|has horizontal scroller|has parent data item|has resize indicator|has shadow|has sub menu|has vertical ruler|has vertical scroller|header cell|header view|hidden|hides when deactivated|highlights by|horizontal line scroll|horizontal page scroll|horizontal ruler view|horizontally resizable|icon image|id|identifier|ignores multiple clicks|image( (alignment|dims when disabled|frame style|scaling))?|imports graphics|increment value|indentation per level|indeterminate|index|integer value|intercell spacing|item height|key( (code|equivalent( modifier)?|window))?|knob thickness|label|last( visible)? column|leading offset|leaf|level|line scroll|loaded|localized sort|location|loop mode|main( (bunde|menu|window))?|marker follows cell|matrix mode|maximum( content)? size|maximum visible columns|menu( form representation)?|miniaturizable|miniaturized|minimized image|minimized title|minimum column width|minimum( content)? size|modal|modified|mouse down state|movie( (controller|file|rect))?|muted|name|needs display|next state|next text|number of tick marks|only tick mark values|opaque|open panel|option key down|outline table column|page scroll|pages across|pages down|palette label|pane splitter|parent data item|parent window|pasteboard|path( (names|separator))?|playing|plays every frame|plays selection only|position|preferred edge|preferred type|pressure|previous text|prompt|properties|prototype cell|pulls down|rate|released when closed|repeated|requested print time|required file type|resizable|resized column|resource path|returns records|reuses columns|rich text|roll over|row height|rulers visible|save panel|scripts path|scrollable|selectable( identifiers)?|selected cell|selected( data)? columns?|selected data items?|selected( data)? rows?|selected item identifier|selection by rect|send action on arrow key|sends action when done editing|separates columns|separator item|sequence number|services menu|shared frameworks path|shared support path|sheet|shift key down|shows alpha|shows state by|size( mode)?|smart insert delete enabled|sort case sensitivity|sort column|sort order|sort type|sorted( data rows)?|sound|source( mask)?|spell checking enabled|starting page|state|string value|sub menu|super menu|super view|tab key traverses cells|tab state|tab type|tab view|table view|tag|target( printer)?|text color|text container insert|text container origin|text returned|tick mark position|time stamp|title(d| (cell|font|height|position|rect))?|tool tip|toolbar|trailing offset|transparent|treat packages as directories|truncated labels|types|unmodified characters|update views|use sort indicator|user defaults|uses data source|uses ruler|uses threaded animation|uses title from previous column|value wraps|version|vertical( (line scroll|page scroll|ruler view))?|vertically resizable|view|visible( document rect)?|volume|width|window|windows menu|wraps|zoomable|zoomed)\b`, NameAttribute, nil},
			{`\b(action cell|alert reply|application|box|browser( cell)?|bundle|button( cell)?|cell|clip view|color well|color-panel|combo box( item)?|control|data( (cell|column|item|row|source))?|default entry|dialog reply|document|drag info|drawer|event|font(-panel)?|formatter|image( (cell|view))?|matrix|menu( item)?|item|movie( view)?|open-panel|outline view|panel|pasteboard|plugin|popup button|progress indicator|responder|save-panel|scroll view|secure text field( cell)?|slider|sound|split view|stepper|tab view( item)?|table( (column|header cell|header view|view))|text( (field( cell)?|view))?|toolbar( item)?|user-defaults|view|window)s?\b`, NameBuiltin, nil},
			{`\b(animate|append|call method|center|close drawer|close panel|display|display alert|display dialog|display panel|go|hide|highlight|increment|item for|load image|load movie|load nib|load panel|load sound|localized string|lock focus|log|open drawer|path for|pause|perform action|play|register|resume|scroll|select( all)?|show|size to fit|start|step back|step forward|stop|synchronize|unlock focus|update)\b`, NameBuiltin, nil},
			{`\b((in )?back of|(in )?front of|[0-9]+(st|nd|rd|th)|first|second|third|fourth|fifth|sixth|seventh|eighth|ninth|tenth|after|back|before|behind|every|front|index|last|middle|some|that|through|thru|where|whose)\b`, NameBuiltin, nil},
			{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
			{`\b([a-zA-Z]\w*)\b`, NameVariable, nil},
			{`[-+]?(\d+\.\d*|\d*\.\d+)(E[-+][0-9]+)?`, LiteralNumberFloat, nil},
			{`[-+]?\d+`, LiteralNumberInteger, nil},
		},
		"comment": {
			{`\(\*`, CommentMultiline, Push()},
			{`\*\)`, CommentMultiline, Pop(1)},
			{`[^*(]+`, CommentMultiline, nil},
			{`[*(]`, CommentMultiline, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/arduino.go (114 lines, generated, vendored)
@@ -1,114 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Arduino lexer.
var Arduino = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Arduino",
		Aliases:   []string{"arduino"},
		Filenames: []string{"*.ino"},
		MimeTypes: []string{"text/x-arduino"},
		EnsureNL:  true,
	},
	arduinoRules,
))

func arduinoRules() Rules {
	return Rules{
		"statements": {
			{Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`), Keyword, nil},
			{`char(16_t|32_t)\b`, KeywordType, nil},
			{`(class)\b`, ByGroups(Keyword, Text), Push("classname")},
			{`(R)(")([^\\()\s]{,16})(\()((?:.|\n)*?)(\)\3)(")`, ByGroups(LiteralStringAffix, LiteralString, LiteralStringDelimiter, LiteralStringDelimiter, LiteralString, LiteralStringDelimiter, LiteralString), nil},
			{`(u8|u|U)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
			{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
			{`(L?)(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')`, ByGroups(LiteralStringAffix, LiteralStringChar, LiteralStringChar, LiteralStringChar), nil},
			{`(\d+\.\d*|\.\d+|\d+)[eE][+-]?\d+[LlUu]*`, LiteralNumberFloat, nil},
			{`(\d+\.\d*|\.\d+|\d+[fF])[fF]?`, LiteralNumberFloat, nil},
			{`0x[0-9a-fA-F]+[LlUu]*`, LiteralNumberHex, nil},
			{`0[0-7]+[LlUu]*`, LiteralNumberOct, nil},
			{`\d+[LlUu]*`, LiteralNumberInteger, nil},
			{`\*/`, Error, nil},
			{`[~!%^&*+=|?:<>/-]`, Operator, nil},
			{`[()\[\],.]`, Punctuation, nil},
			{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
			{`(_Bool|_Complex|_Imaginary|array|atomic_bool|atomic_char|atomic_int|atomic_llong|atomic_long|atomic_schar|atomic_short|atomic_uchar|atomic_uint|atomic_ullong|atomic_ulong|atomic_ushort|auto|bool|boolean|BooleanVariables|Byte|byte|Char|char|char16_t|char32_t|class|complex|Const|const|const_cast|delete|double|dynamic_cast|enum|explicit|extern|Float|float|friend|inline|Int|int|int16_t|int32_t|int64_t|int8_t|Long|long|new|NULL|null|operator|private|PROGMEM|protected|public|register|reinterpret_cast|short|signed|sizeof|Static|static|static_cast|String|struct|typedef|uint16_t|uint32_t|uint64_t|uint8_t|union|unsigned|virtual|Void|void|Volatile|volatile|word)\b`, KeywordType, nil},
			// Start of: Arduino-specific syntax
			{`(and|final|If|Loop|loop|not|or|override|setup|Setup|throw|try|xor)\b`, Keyword, nil}, // Addition to keywords already defined by C++
			{`(ANALOG_MESSAGE|BIN|CHANGE|DEC|DEFAULT|DIGITAL_MESSAGE|EXTERNAL|FALLING|FIRMATA_STRING|HALF_PI|HEX|HIGH|INPUT|INPUT_PULLUP|INTERNAL|INTERNAL1V1|INTERNAL1V1|INTERNAL2V56|INTERNAL2V56|LED_BUILTIN|LED_BUILTIN_RX|LED_BUILTIN_TX|LOW|LSBFIRST|MSBFIRST|OCT|OUTPUT|PI|REPORT_ANALOG|REPORT_DIGITAL|RISING|SET_PIN_MODE|SYSEX_START|SYSTEM_RESET|TWO_PI)\b`, KeywordConstant, nil},
			{`(boolean|const|byte|word|string|String|array)\b`, NameVariable, nil},
			{`(Keyboard|KeyboardController|MouseController|SoftwareSerial|EthernetServer|EthernetClient|LiquidCrystal|RobotControl|GSMVoiceCall|EthernetUDP|EsploraTFT|HttpClient|RobotMotor|WiFiClient|GSMScanner|FileSystem|Scheduler|GSMServer|YunClient|YunServer|IPAddress|GSMClient|GSMModem|Keyboard|Ethernet|Console|GSMBand|Esplora|Stepper|Process|WiFiUDP|GSM_SMS|Mailbox|USBHost|Firmata|PImage|Client|Server|GSMPIN|FileIO|Bridge|Serial|EEPROM|Stream|Mouse|Audio|Servo|File|Task|GPRS|WiFi|Wire|TFT|GSM|SPI|SD)\b`, NameClass, nil},
			{`(abs|Abs|accept|ACos|acos|acosf|addParameter|analogRead|AnalogRead|analogReadResolution|AnalogReadResolution|analogReference|AnalogReference|analogWrite|AnalogWrite|analogWriteResolution|AnalogWriteResolution|answerCall|asin|ASin|asinf|atan|ATan|atan2|ATan2|atan2f|atanf|attach|attached|attachGPRS|attachInterrupt|AttachInterrupt|autoscroll|available|availableForWrite|background|beep|begin|beginPacket|beginSD|beginSMS|beginSpeaker|beginTFT|beginTransmission|beginWrite|bit|Bit|BitClear|bitClear|bitRead|BitRead|bitSet|BitSet|BitWrite|bitWrite|blink|blinkVersion|BSSID|buffer|byte|cbrt|cbrtf|Ceil|ceil|ceilf|changePIN|char|charAt|checkPIN|checkPUK|checkReg|circle|cityNameRead|cityNameWrite|clear|clearScreen|click|close|compareTo|compassRead|concat|config|connect|connected|constrain|Constrain|copysign|copysignf|cos|Cos|cosf|cosh|coshf|countryNameRead|countryNameWrite|createChar|cursor|debugPrint|degrees|Delay|delay|DelayMicroseconds|delayMicroseconds|detach|DetachInterrupt|detachInterrupt|DigitalPinToInterrupt|digitalPinToInterrupt|DigitalRead|digitalRead|DigitalWrite|digitalWrite|disconnect|display|displayLogos|drawBMP|drawCompass|encryptionType|end|endPacket|endSMS|endsWith|endTransmission|endWrite|equals|equalsIgnoreCase|exists|exitValue|Exp|exp|expf|fabs|fabsf|fdim|fdimf|fill|find|findUntil|float|floor|Floor|floorf|flush|fma|fmaf|fmax|fmaxf|fmin|fminf|fmod|fmodf|gatewayIP|get|getAsynchronously|getBand|getButton|getBytes|getCurrentCarrier|getIMEI|getKey|getModifiers|getOemKey|getPINUsed|getResult|getSignalStrength|getSocket|getVoiceCallStatus|getXChange|getYChange|hangCall|height|highByte|HighByte|home|hypot|hypotf|image|indexOf|int|interrupts|IPAddress|IRread|isActionDone|isAlpha|isAlphaNumeric|isAscii|isControl|isDigit|isDirectory|isfinite|isGraph|isHexadecimalDigit|isinf|isListening|isLowerCase|isnan|isPIN|isPressed|isPrintable|isPunct|isSpace|isUpperCase|isValid|isWhitespace|keyboardRead|keyPressed|keyReleased|knobRead|lastIndexOf|ldexp|ldexpf|leftToRight|length|line|lineFollowConfig|listen|listenOnLocalhost|loadImage|localIP|log|Log|log10|log10f|logf|long|lowByte|LowByte|lrint|lrintf|lround|lroundf|macAddress|maintain|map|Map|Max|max|messageAvailable|Micros|micros|millis|Millis|Min|min|mkdir|motorsStop|motorsWrite|mouseDragged|mouseMoved|mousePressed|mouseReleased|move|noAutoscroll|noBlink|noBuffer|noCursor|noDisplay|noFill|noInterrupts|NoInterrupts|noListenOnLocalhost|noStroke|noTone|NoTone|onReceive|onRequest|open|openNextFile|overflow|parseCommand|parseFloat|parseInt|parsePacket|pauseMode|peek|PinMode|pinMode|playFile|playMelody|point|pointTo|position|Pow|pow|powf|prepare|press|print|printFirmwareVersion|println|printVersion|process|processInput|PulseIn|pulseIn|pulseInLong|PulseInLong|put|radians|random|Random|randomSeed|RandomSeed|read|readAccelerometer|readBlue|readButton|readBytes|readBytesUntil|readGreen|readJoystickButton|readJoystickSwitch|readJoystickX|readJoystickY|readLightSensor|readMessage|readMicrophone|readNetworks|readRed|readSlider|readString|readStringUntil|readTemperature|ready|rect|release|releaseAll|remoteIP|remoteNumber|remotePort|remove|replace|requestFrom|retrieveCallingNumber|rewindDirectory|rightToLeft|rmdir|robotNameRead|robotNameWrite|round|roundf|RSSI|run|runAsynchronously|running|runShellCommand|runShellCommandAsynchronously|scanNetworks|scrollDisplayLeft|scrollDisplayRight|seek|sendAnalog|sendDigitalPortPair|sendDigitalPorts|sendString|sendSysex|Serial_Available|Serial_Begin|Serial_End|Serial_Flush|Serial_Peek|Serial_Print|Serial_Println|Serial_Read|serialEvent|setBand|setBitOrder|setCharAt|setClockDivider|setCursor|setDataMode|setDNS|setFirmwareVersion|setMode|setPINUsed|setSpeed|setTextSize|setTimeout|ShiftIn|shiftIn|ShiftOut|shiftOut|shutdown|signbit|sin|Sin|sinf|sinh|sinhf|size|sizeof|Sq|sq|Sqrt|sqrt|sqrtf|SSID|startLoop|startsWith|step|stop|stroke|subnetMask|substring|switchPIN|tan|Tan|tanf|tanh|tanhf|tempoWrite|text|toCharArray|toInt|toLowerCase|tone|Tone|toUpperCase|transfer|trim|trunc|truncf|tuneWrite|turn|updateIR|userNameRead|userNameWrite|voiceCall|waitContinue|width|WiFiServer|word|write|writeBlue|writeGreen|writeJSON|writeMessage|writeMicroseconds|writeRed|writeRGB|yield|Yield)\b`, NameFunction, nil},
			// End of: Arduino-specific syntax
			{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
			{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
			{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
			{`(true|false|NULL)\b`, NameBuiltin, nil},
			{`([a-zA-Z_]\w*)(\s*)(:)(?!:)`, ByGroups(NameLabel, Text, Punctuation), nil},
			{`[a-zA-Z_]\w*`, Name, nil},
		},
		"root": {
			Include("whitespace"),
			{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;{]*)(\{)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), Push("function")},
			{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;]*)(;)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), nil},
			Default(Push("statement")),
			{Words(`__`, `\b`, `virtual_inheritance`, `uuidof`, `super`, `single_inheritance`, `multiple_inheritance`, `interface`, `event`), KeywordReserved, nil},
			{`__(offload|blockingoffload|outer)\b`, KeywordPseudo, nil},
		},
		"classname": {
			{`[a-zA-Z_]\w*`, NameClass, Pop(1)},
			{`\s*(?=>)`, Text, Pop(1)},
		},
		"whitespace": {
			{`^#if\s+0`, CommentPreproc, Push("if0")},
			{`^#`, CommentPreproc, Push("macro")},
			{`^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("if0")},
			{`^(\s*(?:/[*].*?[*]/\s*)?)(#)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("macro")},
			{`\n`, Text, nil},
			{`\s+`, Text, nil},
			{`\\\n`, Text, nil},
			{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
			{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
			{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
		},
		"statement": {
			Include("whitespace"),
			Include("statements"),
			{`[{}]`, Punctuation, nil},
			{`;`, Punctuation, Pop(1)},
		},
		"function": {
			Include("whitespace"),
			Include("statements"),
			{`;`, Punctuation, nil},
			{`\{`, Punctuation, Push()},
			{`\}`, Punctuation, Pop(1)},
		},
		"string": {
			{`"`, LiteralString, Pop(1)},
			{`\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})`, LiteralStringEscape, nil},
			{`[^\\"\n]+`, LiteralString, nil},
			{`\\\n`, LiteralString, nil},
			{`\\`, LiteralString, nil},
		},
		"macro": {
			{`(include)(\s*(?:/[*].*?[*]/\s*)?)([^\n]+)`, ByGroups(CommentPreproc, Text, CommentPreprocFile), nil},
			{`[^/\n]+`, CommentPreproc, nil},
			{`/[*](.|\n)*?[*]/`, CommentMultiline, nil},
			{`//.*?\n`, CommentSingle, Pop(1)},
			{`/`, CommentPreproc, nil},
			{`(?<=\\)\n`, CommentPreproc, nil},
			{`\n`, CommentPreproc, Pop(1)},
		},
		"if0": {
			{`^\s*#if.*?(?<!\\)\n`, CommentPreproc, Push()},
			{`^\s*#el(?:se|if).*\n`, CommentPreproc, Pop(1)},
			{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
			{`.*?\n`, Comment, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/armasm.go (72 lines, generated, vendored)
@@ -1,72 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

var ArmAsm = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "ArmAsm",
		Aliases:   []string{"armasm"},
		EnsureNL:  true,
		Filenames: []string{"*.s", "*.S"},
		MimeTypes: []string{"text/x-armasm", "text/x-asm"},
	},
	armasmRules,
))

func armasmRules() Rules {
	return Rules{
		"commentsandwhitespace": {
			{`\s+`, Text, nil},
			{`[@;].*?\n`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
		},
		"literal": {
			// Binary
			{`0b[01]+`, NumberBin, Pop(1)},
			// Hex
			{`0x\w{1,8}`, NumberHex, Pop(1)},
			// Octal
			{`0\d+`, NumberOct, Pop(1)},
			// Float
			{`\d+?\.\d+?`, NumberFloat, Pop(1)},
			// Integer
			{`\d+`, NumberInteger, Pop(1)},
			// String
			{`(")(.+)(")`, ByGroups(Punctuation, StringDouble, Punctuation), Pop(1)},
			// Char
			{`(')(.{1}|\\.{1})(')`, ByGroups(Punctuation, StringChar, Punctuation), Pop(1)},
		},
		"opcode": {
			// Escape at line end
			{`\n`, Text, Pop(1)},
			// Comment
			{`(@|;).*\n`, CommentSingle, Pop(1)},
			// Whitespace
			{`(\s+|,)`, Text, nil},
			// Register by number
			{`[rapcfxwbhsdqv]\d{1,2}`, NameClass, nil},
			// Address by hex
			{`=0x\w+`, ByGroups(Text, NameLabel), nil},
			// Pseudo address by label
			{`(=)(\w+)`, ByGroups(Text, NameLabel), nil},
			// Immediate
			{`#`, Text, Push("literal")},
		},
		"root": {
			Include("commentsandwhitespace"),
			// Directive with optional param
			{`(\.\w+)([ \t]+\w+\s+?)?`, ByGroups(KeywordNamespace, NameLabel), nil},
			// Label with data
			{`(\w+)(:)(\s+\.\w+\s+)`, ByGroups(NameLabel, Punctuation, KeywordNamespace), Push("literal")},
			// Label
			{`(\w+)(:)`, ByGroups(NameLabel, Punctuation), nil},
			// Syscall Op
			{`svc\s+\w+`, NameNamespace, nil},
			// Opcode
			{`[a-zA-Z]+`, Text, Push("opcode")},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/a/awk.go (52 lines, generated, vendored)
@@ -1,52 +0,0 @@
package a

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Awk lexer.
var Awk = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Awk",
		Aliases:   []string{"awk", "gawk", "mawk", "nawk"},
		Filenames: []string{"*.awk"},
		MimeTypes: []string{"application/x-awk"},
	},
	awkRules,
))

func awkRules() Rules {
	return Rules{
		"commentsandwhitespace": {
			{`\s+`, Text, nil},
			{`#.*$`, CommentSingle, nil},
		},
		"slashstartsregex": {
			Include("commentsandwhitespace"),
			{`/(\\.|[^[/\\\n]|\[(\\.|[^\]\\\n])*])+/\B`, LiteralStringRegex, Pop(1)},
			{`(?=/)`, Text, Push("#pop", "badregex")},
			Default(Pop(1)),
		},
		"badregex": {
			{`\n`, Text, Pop(1)},
		},
		"root": {
			{`^(?=\s|/)`, Text, Push("slashstartsregex")},
			Include("commentsandwhitespace"),
			{`\+\+|--|\|\||&&|in\b|\$|!?~|\|&|(\*\*|[-<>+*%\^/!=|])=?`, Operator, Push("slashstartsregex")},
			{`[{(\[;,]`, Punctuation, Push("slashstartsregex")},
			{`[})\].]`, Punctuation, nil},
			{`(break|continue|do|while|exit|for|if|else|return|switch|case|default)\b`, Keyword, Push("slashstartsregex")},
			{`function\b`, KeywordDeclaration, Push("slashstartsregex")},
			{`(atan2|cos|exp|int|log|rand|sin|sqrt|srand|gensub|gsub|index|length|match|split|patsplit|sprintf|sub|substr|tolower|toupper|close|fflush|getline|next(file)|print|printf|strftime|systime|mktime|delete|system|strtonum|and|compl|lshift|or|rshift|asorti?|isarray|bindtextdomain|dcn?gettext|@(include|load|namespace))\b`, KeywordReserved, nil},
			{`(ARGC|ARGIND|ARGV|BEGIN(FILE)?|BINMODE|CONVFMT|ENVIRON|END(FILE)?|ERRNO|FIELDWIDTHS|FILENAME|FNR|FPAT|FS|IGNORECASE|LINT|NF|NR|OFMT|OFS|ORS|PROCINFO|RLENGTH|RS|RSTART|RT|SUBSEP|TEXTDOMAIN)\b`, NameBuiltin, nil},
			{`[@$a-zA-Z_]\w*`, NameOther, nil},
			{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
			{`0x[0-9a-fA-F]+`, LiteralNumberHex, nil},
			{`[0-9]+`, LiteralNumberInteger, nil},
			{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
			{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/b/ballerina.go (50 lines, generated, vendored)
@@ -1,50 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Ballerina lexer.
var Ballerina = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Ballerina",
		Aliases:   []string{"ballerina"},
		Filenames: []string{"*.bal"},
		MimeTypes: []string{"text/x-ballerina"},
		DotAll:    true,
	},
	ballerinaRules,
))

func ballerinaRules() Rules {
	return Rules{
		"root": {
			{`[^\S\n]+`, Text, nil},
			{`//.*?\n`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
			{`(break|catch|continue|done|else|finally|foreach|forever|fork|if|lock|match|return|throw|transaction|try|while)\b`, Keyword, nil},
			{`((?:(?:[^\W\d]|\$)[\w.\[\]$<>]*\s+)+?)((?:[^\W\d]|\$)[\w$]*)(\s*)(\()`, ByGroups(UsingSelf("root"), NameFunction, Text, Operator), nil},
			{`@[^\W\d][\w.]*`, NameDecorator, nil},
			{`(annotation|bind|but|endpoint|error|function|object|private|public|returns|service|type|var|with|worker)\b`, KeywordDeclaration, nil},
			{`(boolean|byte|decimal|float|int|json|map|nil|record|string|table|xml)\b`, KeywordType, nil},
			{`(true|false|null)\b`, KeywordConstant, nil},
			{`(import)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
			{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
			{`'\\.'|'[^\\]'|'\\u[0-9a-fA-F]{4}'`, LiteralStringChar, nil},
			{`(\.)((?:[^\W\d]|\$)[\w$]*)`, ByGroups(Operator, NameAttribute), nil},
			{`^\s*([^\W\d]|\$)[\w$]*:`, NameLabel, nil},
			{`([^\W\d]|\$)[\w$]*`, Name, nil},
			{`([0-9][0-9_]*\.([0-9][0-9_]*)?|\.[0-9][0-9_]*)([eE][+\-]?[0-9][0-9_]*)?[fFdD]?|[0-9][eE][+\-]?[0-9][0-9_]*[fFdD]?|[0-9]([eE][+\-]?[0-9][0-9_]*)?[fFdD]|0[xX]([0-9a-fA-F][0-9a-fA-F_]*\.?|([0-9a-fA-F][0-9a-fA-F_]*)?\.[0-9a-fA-F][0-9a-fA-F_]*)[pP][+\-]?[0-9][0-9_]*[fFdD]?`, LiteralNumberFloat, nil},
			{`0[xX][0-9a-fA-F][0-9a-fA-F_]*[lL]?`, LiteralNumberHex, nil},
			{`0[bB][01][01_]*[lL]?`, LiteralNumberBin, nil},
			{`0[0-7_]+[lL]?`, LiteralNumberOct, nil},
			{`0|[1-9][0-9_]*[lL]?`, LiteralNumberInteger, nil},
			{`[~^*!%&\[\](){}<>|+=:;,./?-]`, Operator, nil},
			{`\n`, Text, nil},
		},
		"import": {
			{`[\w.]+`, NameNamespace, Pop(1)},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/b/bash.go (100 lines, generated, vendored)
@@ -1,100 +0,0 @@
package b

import (
	"regexp"

	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// TODO(moorereason): can this be factored away?
var bashAnalyserRe = regexp.MustCompile(`(?m)^#!.*/bin/(?:env |)(?:bash|zsh|sh|ksh)`)

// Bash lexer.
var Bash = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Bash",
		Aliases:   []string{"bash", "sh", "ksh", "zsh", "shell"},
		Filenames: []string{"*.sh", "*.ksh", "*.bash", "*.ebuild", "*.eclass", ".env", "*.env", "*.exheres-0", "*.exlib", "*.zsh", "*.zshrc", ".bashrc", "bashrc", ".bash_*", "bash_*", "zshrc", ".zshrc", "PKGBUILD"},
		MimeTypes: []string{"application/x-sh", "application/x-shellscript"},
	},
	bashRules,
).SetAnalyser(func(text string) float32 {
	if bashAnalyserRe.FindString(text) != "" {
		return 1.0
	}
	return 0.0
}))

func bashRules() Rules {
	return Rules{
		"root": {
			Include("basic"),
			{"`", LiteralStringBacktick, Push("backticks")},
			Include("data"),
			Include("interp"),
		},
		"interp": {
			{`\$\(\(`, Keyword, Push("math")},
			{`\$\(`, Keyword, Push("paren")},
			{`\$\{#?`, LiteralStringInterpol, Push("curly")},
			{`\$[a-zA-Z_]\w*`, NameVariable, nil},
			{`\$(?:\d+|[#$?!_*@-])`, NameVariable, nil},
			{`\$`, Text, nil},
		},
		"basic": {
			{`\b(if|fi|else|while|do|done|for|then|return|function|case|select|continue|until|esac|elif)(\s*)\b`, ByGroups(Keyword, Text), nil},
			{"\\b(alias|bg|bind|break|builtin|caller|cd|command|compgen|complete|declare|dirs|disown|echo|enable|eval|exec|exit|export|false|fc|fg|getopts|hash|help|history|jobs|kill|let|local|logout|popd|printf|pushd|pwd|read|readonly|set|shift|shopt|source|suspend|test|time|times|trap|true|type|typeset|ulimit|umask|unalias|unset|wait)(?=[\\s)`])", NameBuiltin, nil},
			{`\A#!.+\n`, CommentPreproc, nil},
			{`#.*(\S|$)`, CommentSingle, nil},
			{`\\[\w\W]`, LiteralStringEscape, nil},
			{`(\b\w+)(\s*)(\+?=)`, ByGroups(NameVariable, Text, Operator), nil},
			{`[\[\]{}()=]`, Operator, nil},
			{`<<<`, Operator, nil},
			{`<<-?\s*(\'?)\\?(\w+)[\w\W]+?\2`, LiteralString, nil},
			{`&&|\|\|`, Operator, nil},
		},
		"data": {
			{`(?s)\$?"(\\\\|\\[0-7]+|\\.|[^"\\$])*"`, LiteralStringDouble, nil},
			{`"`, LiteralStringDouble, Push("string")},
			{`(?s)\$'(\\\\|\\[0-7]+|\\.|[^'\\])*'`, LiteralStringSingle, nil},
			{`(?s)'.*?'`, LiteralStringSingle, nil},
			{`;`, Punctuation, nil},
			{`&`, Punctuation, nil},
			{`\|`, Punctuation, nil},
			{`\s+`, Text, nil},
			{`\d+(?= |$)`, LiteralNumber, nil},
			{"[^=\\s\\[\\]{}()$\"\\'`\\\\<&|;]+", Text, nil},
			{`<`, Text, nil},
		},
		"string": {
			{`"`, LiteralStringDouble, Pop(1)},
			{`(?s)(\\\\|\\[0-7]+|\\.|[^"\\$])+`, LiteralStringDouble, nil},
			Include("interp"),
		},
		"curly": {
			{`\}`, LiteralStringInterpol, Pop(1)},
			{`:-`, Keyword, nil},
			{`\w+`, NameVariable, nil},
			{"[^}:\"\\'`$\\\\]+", Punctuation, nil},
			{`:`, Punctuation, nil},
			Include("root"),
		},
		"paren": {
			{`\)`, Keyword, Pop(1)},
			Include("root"),
		},
		"math": {
			{`\)\)`, Keyword, Pop(1)},
			{`[-+*/%^|&]|\*\*|\|\|`, Operator, nil},
			{`\d+#\d+`, LiteralNumber, nil},
			{`\d+#(?! )`, LiteralNumber, nil},
			{`\d+`, LiteralNumber, nil},
			Include("root"),
		},
		"backticks": {
			{"`", LiteralStringBacktick, Pop(1)},
			Include("root"),
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/b/bashsession.go (27 lines, generated, vendored)
@@ -1,27 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// BashSession lexer.
var BashSession = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "BashSession",
		Aliases:   []string{"bash-session", "console", "shell-session"},
		Filenames: []string{".sh-session"},
		MimeTypes: []string{"text/x-sh"},
		EnsureNL:  true,
	},
	bashsessionRules,
))

func bashsessionRules() Rules {
	return Rules{
		"root": {
			{`^((?:\[[^]]+@[^]]+\]\s?)?[#$%>])(\s*)(.*\n?)`, ByGroups(GenericPrompt, Text, Using(Bash)), nil},
			{`^.+\n?`, GenericOutput, nil},
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/b/batch.go (198 lines, generated, vendored)
@@ -1,198 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Batchfile lexer.
var Batchfile = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "Batchfile",
		Aliases:         []string{"bat", "batch", "dosbatch", "winbatch"},
		Filenames:       []string{"*.bat", "*.cmd"},
		MimeTypes:       []string{"application/x-dos-batch"},
		CaseInsensitive: true,
	},
	batchfileRules,
))

func batchfileRules() Rules {
	return Rules{
		"root": {
			{`\)((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*)`, CommentSingle, nil},
			{`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow")},
			{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
			Include("redirect"),
			{`[\n\x1a]+`, Text, nil},
			{`\(`, Punctuation, Push("root/compound")},
			{`@+`, Punctuation, nil},
			{`((?:for|if|rem)(?:(?=(?:\^[\n\x1a]?)?/)|(?:(?!\^)|(?<=m))(?:(?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+)?(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?)`, ByGroups(Keyword, UsingSelf("text")), Push("follow")},
			{`(goto(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|])*(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|])*)`, ByGroups(Keyword, UsingSelf("text")), Push("follow")},
			{Words(``, `(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])`, `assoc`, `break`, `cd`, `chdir`, `cls`, `color`, `copy`, `date`, `del`, `dir`, `dpath`, `echo`, `endlocal`, `erase`, `exit`, `ftype`, `keys`, `md`, `mkdir`, `mklink`, `move`, `path`, `pause`, `popd`, `prompt`, `pushd`, `rd`, `ren`, `rename`, `rmdir`, `setlocal`, `shift`, `start`, `time`, `title`, `type`, `ver`, `verify`, `vol`), Keyword, Push("follow")},
			{`(call)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("call")},
			{`call(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])`, Keyword, nil},
			{`(for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/f(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/f", "for")},
			{`(for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/l(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/l", "for")},
			{`for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^)`, Keyword, Push("for2", "for")},
			{`(goto(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:?)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("label")},
			{`(if(?:(?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:/i(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:not(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), Keyword, UsingSelf("text")), Push("(?", "if")},
			{`rem(((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)?.*|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*))`, CommentSingle, Push("follow")},
			{`(set(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:\^[\n\x1a]?)?[^\S\n])*)(/a)`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("arithmetic")},
			{`(set(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:/p)?)((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|^=]|\^[\n\x1a]?[^"=])+)?)((?:(?:\^[\n\x1a]?)?=)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), UsingSelf("variable"), Punctuation), Push("follow")},
			Default(Push("follow")),
		},
		"follow": {
			{`((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:)([\t\v\f\r ,;=\xa0]*)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*))(.*)`, ByGroups(Text, Punctuation, Text, NameLabel, CommentSingle), nil},
			Include("redirect"),
			{`(?=[\n\x1a])`, Text, Pop(1)},
			{`\|\|?|&&?`, Punctuation, Pop(1)},
			Include("text"),
		},
		"arithmetic": {
			{`0[0-7]+`, LiteralNumberOct, nil},
			{`0x[\da-f]+`, LiteralNumberHex, nil},
			{`\d+`, LiteralNumberInteger, nil},
			{`[(),]+`, Punctuation, nil},
			{`([=+\-*/!~]|%|\^\^)+`, Operator, nil},
			{`((?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(\^[\n\x1a]?)?[^()=+\-*/!~%^"\n\x1a&<>|\t\v\f\r ,;=\xa0]|\^[\n\x1a\t\v\f\r ,;=\xa0]?[\w\W])+`, UsingSelf("variable"), nil},
			{`(?=[\x00|&])`, Text, Pop(1)},
			Include("follow"),
		},
		"call": {
			{`(:?)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*))`, ByGroups(Punctuation, NameLabel), Pop(1)},
		},
		"label": {
			{`((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*)?)((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|\^[\n\x1a]?[\w\W]|[^"%^\n\x1a&<>|])*)`, ByGroups(NameLabel, CommentSingle), Pop(1)},
		},
		"redirect": {
			{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])\d)?)(>>?&|<&)([\n\x1a\t\v\f\r ,;=\xa0]*)(\d)`, ByGroups(LiteralNumberInteger, Punctuation, Text, LiteralNumberInteger), nil},
			{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])(?<!\^[\n\x1a])\d)?)(>>?|<)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(LiteralNumberInteger, Punctuation, UsingSelf("text")), nil},
		},
		"root/compound": {
			{`\)`, Punctuation, Pop(1)},
			{`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow/compound")},
			{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
			Include("redirect/compound"),
			{`[\n\x1a]+`, Text, nil},
			{`\(`, Punctuation, Push("root/compound")},
			{`@+`, Punctuation, nil},
			{`((?:for|if|rem)(?:(?=(?:\^[\n\x1a]?)?/)|(?:(?!\^)|(?<=m))(?:(?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0)])+)?(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?)`, ByGroups(Keyword, UsingSelf("text")), Push("follow/compound")},
			{`(goto(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|)])*(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|)])*)`, ByGroups(Keyword, UsingSelf("text")), Push("follow/compound")},
			{Words(``, `(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))`, `assoc`, `break`, `cd`, `chdir`, `cls`, `color`, `copy`, `date`, `del`, `dir`, `dpath`, `echo`, `endlocal`, `erase`, `exit`, `ftype`, `keys`, `md`, `mkdir`, `mklink`, `move`, `path`, `pause`, `popd`, `prompt`, `pushd`, `rd`, `ren`, `rename`, `rmdir`, `setlocal`, `shift`, `start`, `time`, `title`, `type`, `ver`, `verify`, `vol`), Keyword, Push("follow/compound")},
			{`(call)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("call/compound")},
			{`call(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))`, Keyword, nil},
			{`(for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/f(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/f", "for")},
			{`(for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/l(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/l", "for")},
			{`for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^)`, Keyword, Push("for2", "for")},
			{`(goto(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:?)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("label/compound")},
			{`(if(?:(?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:/i(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:not(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), Keyword, UsingSelf("text")), Push("(?", "if")},
			{`rem(((?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)?.*|(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))(?:(?:[^\n\x1a^)]|\^[\n\x1a]?[^)])*))`, CommentSingle, Push("follow/compound")},
			{`(set(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:\^[\n\x1a]?)?[^\S\n])*)(/a)`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("arithmetic/compound")},
			{`(set(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:/p)?)((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|^=)]|\^[\n\x1a]?[^"=])+)?)((?:(?:\^[\n\x1a]?)?=)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), UsingSelf("variable"), Punctuation), Push("follow/compound")},
			Default(Push("follow/compound")),
		},
		"follow/compound": {
			{`(?=\))`, Text, Pop(1)},
			{`((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:)([\t\v\f\r ,;=\xa0]*)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*))(.*)`, ByGroups(Text, Punctuation, Text, NameLabel, CommentSingle), nil},
			Include("redirect/compound"),
			{`(?=[\n\x1a])`, Text, Pop(1)},
			{`\|\|?|&&?`, Punctuation, Pop(1)},
			Include("text"),
		},
		"arithmetic/compound": {
			{`(?=\))`, Text, Pop(1)},
			{`0[0-7]+`, LiteralNumberOct, nil},
			{`0x[\da-f]+`, LiteralNumberHex, nil},
			{`\d+`, LiteralNumberInteger, nil},
			{`[(),]+`, Punctuation, nil},
			{`([=+\-*/!~]|%|\^\^)+`, Operator, nil},
			{`((?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(\^[\n\x1a]?)?[^()=+\-*/!~%^"\n\x1a&<>|\t\v\f\r ,;=\xa0]|\^[\n\x1a\t\v\f\r ,;=\xa0]?[^)])+`, UsingSelf("variable"), nil},
			{`(?=[\x00|&])`, Text, Pop(1)},
			Include("follow"),
		},
		"call/compound": {
			{`(?=\))`, Text, Pop(1)},
			{`(:?)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*))`, ByGroups(Punctuation, NameLabel), Pop(1)},
		},
		"label/compound": {
			{`(?=\))`, Text, Pop(1)},
			{`((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*)?)((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|\^[\n\x1a]?[^)]|[^"%^\n\x1a&<>|)])*)`, ByGroups(NameLabel, CommentSingle), Pop(1)},
		},
		"redirect/compound": {
			{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])\d)?)(>>?&|<&)([\n\x1a\t\v\f\r ,;=\xa0]*)(\d)`, ByGroups(LiteralNumberInteger, Punctuation, Text, LiteralNumberInteger), nil},
			{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])(?<!\^[\n\x1a])\d)?)(>>?|<)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0)])+))+))`, ByGroups(LiteralNumberInteger, Punctuation, UsingSelf("text")), nil},
		},
		"variable-or-escape": {
			{`(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))`, NameVariable, nil},
			{`%%|\^[\n\x1a]?(\^!|[\w\W])`, LiteralStringEscape, nil},
		},
		"string": {
			{`"`, LiteralStringDouble, Pop(1)},
			{`(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))`, NameVariable, nil},
			{`\^!|%%`, LiteralStringEscape, nil},
			{`[^"%^\n\x1a]+|[%^]`, LiteralStringDouble, nil},
			Default(Pop(1)),
		},
		"sqstring": {
			Include("variable-or-escape"),
			{`[^%]+|%`, LiteralStringSingle, nil},
		},
		"bqstring": {
			Include("variable-or-escape"),
			{`[^%]+|%`, LiteralStringBacktick, nil},
		},
		"text": {
			{`"`, LiteralStringDouble, Push("string")},
			Include("variable-or-escape"),
			{`[^"%^\n\x1a&<>|\t\v\f\r ,;=\xa0\d)]+|.`, Text, nil},
		},
		"variable": {
			{`"`, LiteralStringDouble, Push("string")},
			Include("variable-or-escape"),
			{`[^"%^\n\x1a]+|.`, NameVariable, nil},
		},
		"for": {
			{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(in)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(\()`, ByGroups(UsingSelf("text"), Keyword, UsingSelf("text"), Punctuation), Pop(1)},
			Include("follow"),
		},
		"for2": {
			{`\)`, Punctuation, nil},
			{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(do(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(UsingSelf("text"), Keyword), Pop(1)},
			{`[\n\x1a]+`, Text, nil},
			Include("follow"),
		},
		"for/f": {
			{`(")((?:(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"])*?")([\n\x1a\t\v\f\r ,;=\xa0]*)(\))`, ByGroups(LiteralStringDouble, UsingSelf("string"), Text, Punctuation), nil},
			{`"`, LiteralStringDouble, Push("#pop", "for2", "string")},
			{`('(?:%%|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[\w\W])*?')([\n\x1a\t\v\f\r ,;=\xa0]*)(\))`, ByGroups(UsingSelf("sqstring"), Text, Punctuation), nil},
			{"(`(?:%%|(?:(?:%(?:\\*|(?:~[a-z]*(?:\\$[^:]+:)?)?\\d|[^%:\\n\\x1a]+(?::(?:~(?:-?\\d+)?(?:,(?:-?\\d+)?)?|(?:[^%\\n\\x1a^]|\\^[^%\\n\\x1a])[^=\\n\\x1a]*=(?:[^%\\n\\x1a^]|\\^[^%\\n\\x1a])*)?)?%))|(?:\\^?![^!:\\n\\x1a]+(?::(?:~(?:-?\\d+)?(?:,(?:-?\\d+)?)?|(?:[^!\\n\\x1a^]|\\^[^!\\n\\x1a])[^=\\n\\x1a]*=(?:[^!\\n\\x1a^]|\\^[^!\\n\\x1a])*)?)?\\^?!))|[\\w\\W])*?`)([\\n\\x1a\\t\\v\\f\\r ,;=\\xa0]*)(\\))", ByGroups(UsingSelf("bqstring"), Text, Punctuation), nil},
			Include("for2"),
		},
		"for/l": {
			{`-?\d+`, LiteralNumberInteger, nil},
			Include("for2"),
		},
		"if": {
			{`((?:cmdextversion|errorlevel)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(\d+)`, ByGroups(Keyword, UsingSelf("text"), LiteralNumberInteger), Pop(1)},
			{`(defined(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(Keyword, UsingSelf("text"), UsingSelf("variable")), Pop(1)},
			{`(exist(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(Keyword, UsingSelf("text")), Pop(1)},
			{`((?:-?(?:0[0-7]+|0x[\da-f]+|\d+)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:equ|geq|gtr|leq|lss|neq))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:-?(?:0[0-7]+|0x[\da-f]+|\d+)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(UsingSelf("arithmetic"), OperatorWord, UsingSelf("arithmetic")), Pop(1)},
			{`(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)`, UsingSelf("text"), Push("#pop", "if2")},
		},
		"if2": {
			{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(==)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(UsingSelf("text"), Operator, UsingSelf("text")), Pop(1)},
			{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:equ|geq|gtr|leq|lss|neq))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(UsingSelf("text"), OperatorWord, UsingSelf("text")), Pop(1)},
		},
		"(?": {
			{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
			{`\(`, Punctuation, Push("#pop", "else?", "root/compound")},
			Default(Pop(1)),
		},
		"else?": {
			{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
			{`else(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])`, Keyword, Pop(1)},
			Default(Pop(1)),
		},
	}
}
vendor/github.com/alecthomas/chroma/lexers/b/bibtex.go (80 lines, generated, vendored)
@@ -1,80 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Bibtex lexer.
var Bibtex = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "BibTeX",
		Aliases:         []string{"bib", "bibtex"},
		Filenames:       []string{"*.bib"},
		MimeTypes:       []string{"text/x-bibtex"},
		NotMultiline:    true,
		CaseInsensitive: true,
	},
	bibtexRules,
))

func bibtexRules() Rules {
	return Rules{
		"root": {
			Include("whitespace"),
			{`@comment`, Comment, nil},
			{`@preamble`, NameClass, Push("closing-brace", "value", "opening-brace")},
			{`@string`, NameClass, Push("closing-brace", "field", "opening-brace")},
			{"@[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameClass, Push("closing-brace", "command-body", "opening-brace")},
			{`.+`, Comment, nil},
		},
		"opening-brace": {
			Include("whitespace"),
			{`[{(]`, Punctuation, Pop(1)},
		},
		"closing-brace": {
			Include("whitespace"),
			{`[})]`, Punctuation, Pop(1)},
		},
		"command-body": {
			Include("whitespace"),
			{`[^\s\,\}]+`, NameLabel, Push("#pop", "fields")},
		},
		"fields": {
			Include("whitespace"),
			{`,`, Punctuation, Push("field")},
			Default(Pop(1)),
		},
		"field": {
			Include("whitespace"),
			{"[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameAttribute, Push("value", "=")},
			Default(Pop(1)),
		},
		"=": {
			Include("whitespace"),
			{`=`, Punctuation, Pop(1)},
		},
		"value": {
			Include("whitespace"),
			{"[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameVariable, nil},
			{`"`, LiteralString, Push("quoted-string")},
			{`\{`, LiteralString, Push("braced-string")},
			{`[\d]+`, LiteralNumber, nil},
			{`#`, Punctuation, nil},
			Default(Pop(1)),
		},
		"quoted-string": {
			{`\{`, LiteralString, Push("braced-string")},
			{`"`, LiteralString, Pop(1)},
			{`[^\{\"]+`, LiteralString, nil},
		},
		"braced-string": {
			{`\{`, LiteralString, Push()},
			{`\}`, LiteralString, Pop(1)},
			{`[^\{\}]+`, LiteralString, nil},
		},
		"whitespace": {
			{`\s+`, Text, nil},
		},
	}
}
112
vendor/github.com/alecthomas/chroma/lexers/b/bicep.go
generated
vendored
@@ -1,112 +0,0 @@
package b

import (
	"strings"

	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Bicep lexer.
var Bicep = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Bicep",
		Aliases:   []string{"bicep"},
		Filenames: []string{"*.bicep"},
	},
	bicepRules,
))

func bicepRules() Rules {
	bicepFunctions := []string{
		"any",
		"array",
		"concat",
		"contains",
		"empty",
		"first",
		"intersection",
		"items",
		"last",
		"length",
		"min",
		"max",
		"range",
		"skip",
		"take",
		"union",
		"dateTimeAdd",
		"utcNow",
		"deployment",
		"environment",
		"loadFileAsBase64",
		"loadTextContent",
		"int",
		"json",
		"extensionResourceId",
		"getSecret",
		"list",
		"listKeys",
		"listKeyValue",
		"listAccountSas",
		"listSecrets",
		"pickZones",
		"reference",
		"resourceId",
		"subscriptionResourceId",
		"tenantResourceId",
		"managementGroup",
		"resourceGroup",
		"subscription",
		"tenant",
		"base64",
		"base64ToJson",
		"base64ToString",
		"dataUri",
		"dataUriToString",
		"endsWith",
		"format",
		"guid",
		"indexOf",
		"lastIndexOf",
		"length",
		"newGuid",
		"padLeft",
		"replace",
		"split",
		"startsWith",
		"string",
		"substring",
		"toLower",
		"toUpper",
		"trim",
		"uniqueString",
		"uri",
		"uriComponent",
		"uriComponentToString",
	}

	return Rules{
		"root": {
			{`//[^\n\r]+`, CommentSingle, nil},
			{`/\*.*?\*/`, CommentMultiline, nil},
			{`([']?\w+[']?)(:)`, ByGroups(NameProperty, Punctuation), nil},
			{`\b('(resourceGroup|subscription|managementGroup|tenant)')\b`, KeywordNamespace, nil},
			{`'[\w\$\{\(\)\}\.]{1,}?'`, LiteralStringInterpol, nil},
			{`('''|').*?('''|')`, LiteralString, nil},
			{`\b(allowed|batchSize|description|maxLength|maxValue|metadata|minLength|minValue|secure)\b`, NameDecorator, nil},
			{`\b(az|sys)\.`, NameNamespace, nil},
			{`\b(` + strings.Join(bicepFunctions, "|") + `)\b`, NameFunction, nil},
			// https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/bicep-functions-logical
			{`\b(bool)(\()`, ByGroups(NameFunction, Punctuation), nil},
			{`\b(for|if|in)\b`, Keyword, nil},
			{`\b(module|output|param|resource|var)\b`, KeywordDeclaration, nil},
			{`\b(array|bool|int|object|string)\b`, KeywordType, nil},
			// https://docs.microsoft.com/en-us/azure/azure-resource-manager/bicep/operators
			{`(>=|>|<=|<|==|!=|=~|!~|::|&&|\?\?|!|-|%|\*|\/|\+)`, Operator, nil},
			{`[\(\)\[\]\.:\?{}@=]`, Punctuation, nil},
			{`[\w_-]+`, Text, nil},
			{`\s+`, TextWhitespace, nil},
		},
	}
}
52
vendor/github.com/alecthomas/chroma/lexers/b/blitz.go
generated
vendored
@@ -1,52 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Blitzbasic lexer.
var Blitzbasic = internal.Register(MustNewLazyLexer(
	&Config{
		Name:            "BlitzBasic",
		Aliases:         []string{"blitzbasic", "b3d", "bplus"},
		Filenames:       []string{"*.bb", "*.decls"},
		MimeTypes:       []string{"text/x-bb"},
		CaseInsensitive: true,
	},
	blitzbasicRules,
))

func blitzbasicRules() Rules {
	return Rules{
		"root": {
			{`[ \t]+`, Text, nil},
			{`;.*?\n`, CommentSingle, nil},
			{`"`, LiteralStringDouble, Push("string")},
			{`[0-9]+\.[0-9]*(?!\.)`, LiteralNumberFloat, nil},
			{`\.[0-9]+(?!\.)`, LiteralNumberFloat, nil},
			{`[0-9]+`, LiteralNumberInteger, nil},
			{`\$[0-9a-f]+`, LiteralNumberHex, nil},
			{`\%[10]+`, LiteralNumberBin, nil},
			{Words(`\b`, `\b`, `Shl`, `Shr`, `Sar`, `Mod`, `Or`, `And`, `Not`, `Abs`, `Sgn`, `Handle`, `Int`, `Float`, `Str`, `First`, `Last`, `Before`, `After`), Operator, nil},
			{`([+\-*/~=<>^])`, Operator, nil},
			{`[(),:\[\]\\]`, Punctuation, nil},
			{`\.([ \t]*)([a-z]\w*)`, NameLabel, nil},
			{`\b(New)\b([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameClass), nil},
			{`\b(Gosub|Goto)\b([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameLabel), nil},
			{`\b(Object)\b([ \t]*)([.])([ \t]*)([a-z]\w*)\b`, ByGroups(Operator, Text, Punctuation, Text, NameClass), nil},
			{`\b([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?\b([ \t]*)(\()`, ByGroups(NameFunction, Text, KeywordType, Text, Punctuation, Text, NameClass, Text, Punctuation), nil},
			{`\b(Function)\b([ \t]+)([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?`, ByGroups(KeywordReserved, Text, NameFunction, Text, KeywordType, Text, Punctuation, Text, NameClass), nil},
			{`\b(Type)([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameClass), nil},
			{`\b(Pi|True|False|Null)\b`, KeywordConstant, nil},
			{`\b(Local|Global|Const|Field|Dim)\b`, KeywordDeclaration, nil},
			{Words(`\b`, `\b`, `End`, `Return`, `Exit`, `Chr`, `Len`, `Asc`, `New`, `Delete`, `Insert`, `Include`, `Function`, `Type`, `If`, `Then`, `Else`, `ElseIf`, `EndIf`, `For`, `To`, `Next`, `Step`, `Each`, `While`, `Wend`, `Repeat`, `Until`, `Forever`, `Select`, `Case`, `Default`, `Goto`, `Gosub`, `Data`, `Read`, `Restore`), KeywordReserved, nil},
			{`([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?`, ByGroups(NameVariable, Text, KeywordType, Text, Punctuation, Text, NameClass), nil},
		},
		"string": {
			{`""`, LiteralStringDouble, nil},
			{`"C?`, LiteralStringDouble, Pop(1)},
			{`[^"]+`, LiteralStringDouble, nil},
		},
	}
}
28
vendor/github.com/alecthomas/chroma/lexers/b/bnf.go
generated
vendored
@@ -1,28 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Bnf lexer.
var Bnf = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "BNF",
		Aliases:   []string{"bnf"},
		Filenames: []string{"*.bnf"},
		MimeTypes: []string{"text/x-bnf"},
	},
	bnfRules,
))

func bnfRules() Rules {
	return Rules{
		"root": {
			{`(<)([ -;=?-~]+)(>)`, ByGroups(Punctuation, NameClass, Punctuation), nil},
			{`::=`, Operator, nil},
			{`[^<>:]+`, Text, nil},
			{`.`, Text, nil},
		},
	}
}
38
vendor/github.com/alecthomas/chroma/lexers/b/brainfuck.go
generated
vendored
@@ -1,38 +0,0 @@
package b

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Brainfuck lexer.
var Brainfuck = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Brainfuck",
		Aliases:   []string{"brainfuck", "bf"},
		Filenames: []string{"*.bf", "*.b"},
		MimeTypes: []string{"application/x-brainfuck"},
	},
	brainfuckRules,
))

func brainfuckRules() Rules {
	return Rules{
		"common": {
			{`[.,]+`, NameTag, nil},
			{`[+-]+`, NameBuiltin, nil},
			{`[<>]+`, NameVariable, nil},
			{`[^.,+\-<>\[\]]+`, Comment, nil},
		},
		"root": {
			{`\[`, Keyword, Push("loop")},
			{`\]`, Error, nil},
			Include("common"),
		},
		"loop": {
			{`\[`, Keyword, Push()},
			{`\]`, Keyword, Pop(1)},
			Include("common"),
		},
	}
}
96
vendor/github.com/alecthomas/chroma/lexers/c/c.go
generated
vendored
@@ -1,96 +0,0 @@
package c

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// C lexer.
var C = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "C",
		Aliases:   []string{"c"},
		Filenames: []string{"*.c", "*.h", "*.idc", "*.x[bp]m"},
		MimeTypes: []string{"text/x-chdr", "text/x-csrc", "image/x-xbitmap", "image/x-xpixmap"},
		EnsureNL:  true,
	},
	cRules,
))

func cRules() Rules {
	return Rules{
		"whitespace": {
			{`^#if\s+0`, CommentPreproc, Push("if0")},
			{`^#`, CommentPreproc, Push("macro")},
			{`^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("if0")},
			{`^(\s*(?:/[*].*?[*]/\s*)?)(#)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("macro")},
			{`\n`, Text, nil},
			{`\s+`, Text, nil},
			{`\\\n`, Text, nil},
			{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
			{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
			{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
		},
		"statements": {
			{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
			{`(L?)(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')`, ByGroups(LiteralStringAffix, LiteralStringChar, LiteralStringChar, LiteralStringChar), nil},
			{`(\d+\.\d*|\.\d+|\d+)[eE][+-]?\d+[LlUu]*`, LiteralNumberFloat, nil},
			{`(\d+\.\d*|\.\d+|\d+[fF])[fF]?`, LiteralNumberFloat, nil},
			{`0x[0-9a-fA-F]+[LlUu]*`, LiteralNumberHex, nil},
			{`0[0-7]+[LlUu]*`, LiteralNumberOct, nil},
			{`\d+[LlUu]*`, LiteralNumberInteger, nil},
			{`\*/`, Error, nil},
			{`[~!%^&*+=|?:<>/-]`, Operator, nil},
			{`[()\[\],.]`, Punctuation, nil},
			{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
			{`(bool|int|long|float|short|double|char((8|16|32)_t)?|unsigned|signed|void|u?int(_fast|_least|)(8|16|32|64)_t)\b`, KeywordType, nil},
			{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
			{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
			{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
			{`(true|false|NULL)\b`, NameBuiltin, nil},
			{`([a-zA-Z_]\w*)(\s*)(:)(?!:)`, ByGroups(NameLabel, Text, Punctuation), nil},
			{`[a-zA-Z_]\w*`, Name, nil},
		},
		"root": {
			Include("whitespace"),
			{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;{]*)(\{)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), Push("function")},
			{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;]*)(;)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), nil},
			Default(Push("statement")),
		},
		"statement": {
			Include("whitespace"),
			Include("statements"),
			{`[{}]`, Punctuation, nil},
			{`;`, Punctuation, Pop(1)},
		},
		"function": {
			Include("whitespace"),
			Include("statements"),
			{`;`, Punctuation, nil},
			{`\{`, Punctuation, Push()},
			{`\}`, Punctuation, Pop(1)},
		},
		"string": {
			{`"`, LiteralString, Pop(1)},
			{`\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})`, LiteralStringEscape, nil},
			{`[^\\"\n]+`, LiteralString, nil},
			{`\\\n`, LiteralString, nil},
			{`\\`, LiteralString, nil},
		},
		"macro": {
			{`(include)(\s*(?:/[*].*?[*]/\s*)?)([^\n]+)`, ByGroups(CommentPreproc, Text, CommentPreprocFile), nil},
			{`[^/\n]+`, CommentPreproc, nil},
			{`/[*](.|\n)*?[*]/`, CommentMultiline, nil},
			{`//.*?\n`, CommentSingle, Pop(1)},
			{`/`, CommentPreproc, nil},
			{`(?<=\\)\n`, CommentPreproc, nil},
			{`\n`, CommentPreproc, Pop(1)},
		},
		"if0": {
			{`^\s*#if.*?(?<!\\)\n`, CommentPreproc, Push()},
			{`^\s*#el(?:se|if).*\n`, CommentPreproc, Pop(1)},
			{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
			{`.*?\n`, Comment, nil},
		},
	}
}
65
vendor/github.com/alecthomas/chroma/lexers/c/capnproto.go
generated
vendored
@@ -1,65 +0,0 @@
package c

import (
	. "github.com/alecthomas/chroma" // nolint
	"github.com/alecthomas/chroma/lexers/internal"
)

// Cap'N'Proto Proto lexer.
var CapNProto = internal.Register(MustNewLazyLexer(
	&Config{
		Name:      "Cap'n Proto",
		Aliases:   []string{"capnp"},
		Filenames: []string{"*.capnp"},
		MimeTypes: []string{},
	},
	capNProtoRules,
))

func capNProtoRules() Rules {
	return Rules{
		"root": {
			{`#.*?$`, CommentSingle, nil},
			{`@[0-9a-zA-Z]*`, NameDecorator, nil},
			{`=`, Literal, Push("expression")},
			{`:`, NameClass, Push("type")},
			{`\$`, NameAttribute, Push("annotation")},
			{`(struct|enum|interface|union|import|using|const|annotation|extends|in|of|on|as|with|from|fixed)\b`, Keyword, nil},
			{`[\w.]+`, Name, nil},
			{`[^#@=:$\w]+`, Text, nil},
		},
		"type": {
			{`[^][=;,(){}$]+`, NameClass, nil},
			{`[[(]`, NameClass, Push("parentype")},
			Default(Pop(1)),
		},
		"parentype": {
			{`[^][;()]+`, NameClass, nil},
			{`[[(]`, NameClass, Push()},
			{`[])]`, NameClass, Pop(1)},
			Default(Pop(1)),
		},
		"expression": {
			{`[^][;,(){}$]+`, Literal, nil},
			{`[[(]`, Literal, Push("parenexp")},
			Default(Pop(1)),
		},
		"parenexp": {
			{`[^][;()]+`, Literal, nil},
			{`[[(]`, Literal, Push()},
			{`[])]`, Literal, Pop(1)},
			Default(Pop(1)),
		},
		"annotation": {
			{`[^][;,(){}=:]+`, NameAttribute, nil},
			{`[[(]`, NameAttribute, Push("annexp")},
			Default(Pop(1)),
		},
		"annexp": {
			{`[^][;()]+`, NameAttribute, nil},
			{`[[(]`, NameAttribute, Push()},
			{`[])]`, NameAttribute, Pop(1)},
			Default(Pop(1)),
		},
	}
}
Some files were not shown because too many files have changed in this diff.