mirror of
https://github.com/cheat/cheat.git
synced 2026-03-07 11:13:33 +01:00
Compare commits: 1 commit (`dependabot` … `4.5.0`), SHA `cc85a4bdb1`
`.gitignore` (vendored): 3 changes
@@ -1,2 +1,5 @@
```
dist
tags
.tmp
*.test
.claude
```
`CLAUDE.md`: new file, 117 lines
@@ -0,0 +1,117 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Common Development Commands

### Building

```bash
# Build for your architecture
make build

# Build release binaries for all platforms
make build-release

# Install cheat to your PATH
make install
```
### Testing and Quality Checks

```bash
# Run all tests
make test
go test ./...

# Run a single test
go test -run TestFunctionName ./internal/package_name

# Generate test coverage report
make coverage

# Run linter (revive)
make lint

# Run go vet
make vet

# Format code
make fmt

# Run all checks (vendor, fmt, lint, vet, test)
make check
```
### Development Setup

```bash
# Install development dependencies (revive linter, scc)
make setup

# Update and verify vendored dependencies
make vendor-update
```
## Architecture Overview

The `cheat` command-line tool is organized into several key packages:

### Command Layer (`cmd/cheat/`)

- `main.go`: Entry point, argument parsing, command routing
- `cmd_*.go`: Individual command implementations (view, edit, list, search, etc.)
- Commands are selected based on docopt-parsed arguments
### Core Internal Packages

1. **`internal/config`**: Configuration management
   - Loads YAML config from platform-specific paths
   - Manages editor, pager, colorization settings
   - Validates and expands cheatpath configurations

2. **`internal/cheatpath`**: Cheatsheet path management
   - Represents collections of cheatsheets on filesystem
   - Handles read-only vs writable paths
   - Supports filtering and validation

3. **`internal/sheet`**: Individual cheatsheet handling
   - Parses YAML frontmatter for tags and syntax
   - Implements syntax highlighting via Chroma
   - Provides search functionality within sheets

4. **`internal/sheets`**: Collection operations
   - Loads sheets from multiple cheatpaths
   - Consolidates duplicates (local overrides global)
   - Filters by tags and sorts results

5. **`internal/display`**: Output formatting
   - Writes to stdout or pager
   - Handles text formatting and indentation

6. **`internal/repo`**: Git repository management
   - Clones community cheatsheet repositories
   - Updates existing repositories
### Key Design Patterns

- **Filesystem-based storage**: Cheatsheets are plain text files
- **Override mechanism**: Local sheets override community sheets with the same name
- **Tag system**: Sheets can be categorized with tags in frontmatter
- **Multiple cheatpaths**: Supports personal, community, and directory-scoped sheets
### Sheet Format

Cheatsheets are plain text files optionally prefixed with YAML frontmatter:

```
---
syntax: bash
tags: [ networking, ssh ]
---
# SSH tunneling example
ssh -L 8080:localhost:80 user@remote
```
### Working with the Codebase

- Always check for `.git` directories and skip them during filesystem walks
- Use `go-git` for repository operations, not exec'ing git commands
- Platform-specific paths are handled in `internal/config/paths.go`
- Color output uses ANSI codes via the Chroma library
- Test files use the `internal/mock` package for test data
@@ -1,48 +1,14 @@

```diff
-CONTRIBUTING
+Contributing
 ============
-Do you want to contribute to `cheat`? There are a few ways to help:
-
-#### Submit a cheatsheet ####
-Do you have a witty bash one-liner to share? [Open a pull-request][pr] against
-the [cheatsheets][] repository. (The `cheat` executable source code lives in
-[cheat/cheat][cheat]. Cheatsheet content lives in
-[cheat/cheatsheets][cheatsheets].)
+Thank you for your interest in `cheat`.

-#### Report a bug ####
-Did you find a bug? Report it in the [issue tracker][issues]. (But before you
-do, please look through the open issues to make sure that it hasn't already
-been reported.)
+Pull requests are no longer being accepted, and have been disabled on this
+repository. The maintainer is not currently reviewing or merging external code
+contributions.

-#### Add a feature ####
-Do you have a feature that you'd like to contribute? Propose it in the [issue
-tracker][issues] to discuss with the maintainer whether it would be considered
-for merging.
+Bug reports are still welcome. If you've found a bug, please open an issue in
+the [issue tracker][issues]. Before doing so, please search through the
+existing open issues to make sure it hasn't already been reported.
+
+`cheat` is mostly mature and feature-complete, but may still have some room for
+new features. See [HACKING.md][hacking] for a quick-start guide to `cheat`
+development.

-#### Add documentation ####
-Did you encounter features, bugs, edge-cases, use-cases, or environment
-considerations that were undocumented or under-documented? Add them to the
-[wiki][]. (You may also open a pull-request against the `README`, if
-appropriate.)
-
-Do you enjoy technical writing or proofreading? Help keep the documentation
-error-free and well-organized.
-
-#### Spread the word ####
-Are you unable to do the above, but still want to contribute? You can help
-`cheat` simply by telling others about it. Share it with friends and coworkers
-that might benefit from using it.
-
-#### Pull Requests ####
-Please open all pull-requests against the `develop` branch.
-
-[cheat]: https://github.com/cheat/cheat
-[cheatsheets]: https://github.com/cheat/cheatsheets
 [hacking]: HACKING.md
-[issues]: https://github.com/cheat/cheat/issues
-[pr]: https://help.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request-from-a-fork
-[wiki]: https://github.com/cheat/cheat/wiki
+[issues]: https://github.com/cheat/cheat/issues
```
`HACKING.md`: 250 changes
@@ -1,57 +1,241 @@
# Hacking Guide

This document provides a comprehensive guide for developing `cheat`, including setup, architecture overview, and code patterns.

## Quick Start

### 1. Install system dependencies

The following are required and must be available on your `PATH`:

- `git`
- `go` (>= 1.19 is recommended)
- `make`

Optional dependencies:

- `docker`
- `pandoc` (necessary to generate a `man` page)

### 2. Install utility applications

Run `make setup` to install `scc` and `revive`, which are used by various `make` targets.

### 3. Development workflow

1. Make changes to the `cheat` source code
2. Run `make test` to run unit-tests
3. Fix compiler errors and failing tests as necessary
4. Run `make build`. A `cheat` executable will be written to the `dist` directory
5. Use the new executable by running `dist/cheat <command>`
6. Run `make install` to install `cheat` to your `PATH`
7. Run `make build-release` to build cross-platform binaries in `dist`
8. Run `make clean` to clean the `dist` directory when desired

You may run `make help` to see a list of available `make` commands.

### 4. Testing
#### Unit Tests

Run unit tests with:

```bash
make test
```

#### Integration Tests

Integration tests that require network access are separated using build tags. Run them with:

```bash
make test-integration
```

To run all tests (unit and integration):

```bash
make test-all
```

#### Test Coverage

Generate a coverage report with:

```bash
make coverage       # HTML report
make coverage-text  # Terminal output
```

## Architecture Overview

### Package Structure

The `cheat` application follows a clean architecture with well-separated concerns:

- **`cmd/cheat/`**: Command layer with argument parsing and command routing
- **`internal/config`**: Configuration management (YAML loading, validation, paths)
- **`internal/cheatpath`**: Cheatsheet path management (collections, filtering)
- **`internal/sheet`**: Individual cheatsheet handling (parsing, search, highlighting)
- **`internal/sheets`**: Collection operations (loading, consolidation, filtering)
- **`internal/display`**: Output formatting (pager integration, colorization)
- **`internal/repo`**: Git repository management for community sheets

### Key Design Patterns

- **Filesystem-based storage**: Cheatsheets are plain text files
- **Override mechanism**: Local sheets override community sheets with the same name
- **Tag system**: Sheets can be categorized with tags in frontmatter
- **Multiple cheatpaths**: Supports personal, community, and directory-scoped sheets

## Core Types and Functions

### Config (`internal/config`)

The main configuration structure:

```go
type Config struct {
	Colorize   bool           `yaml:"colorize"`
	Editor     string         `yaml:"editor"`
	Cheatpaths []cp.Cheatpath `yaml:"cheatpaths"`
	Style      string         `yaml:"style"`
	Formatter  string         `yaml:"formatter"`
	Pager      string         `yaml:"pager"`
	Path       string
}
```

Key functions:
- `New(opts, confPath, resolve)` - Load config from file
- `Validate()` - Validate configuration values
- `Editor()` - Get editor from environment or defaults (package-level function)
- `Pager()` - Get pager from environment or defaults (package-level function)

### Cheatpath (`internal/cheatpath`)

Represents a directory containing cheatsheets:

```go
type Cheatpath struct {
	Name     string   // Friendly name (e.g., "personal")
	Path     string   // Filesystem path
	Tags     []string // Tags applied to all sheets in this path
	ReadOnly bool     // Whether sheets can be modified
}
```

### Sheet (`internal/sheet`)

Represents an individual cheatsheet:

```go
type Sheet struct {
	Title     string   // Sheet name (from filename)
	CheatPath string   // Name of the cheatpath this sheet belongs to
	Path      string   // Full filesystem path
	Text      string   // Content (without frontmatter)
	Tags      []string // Combined tags (from frontmatter + cheatpath)
	Syntax    string   // Syntax for highlighting
	ReadOnly  bool     // Whether sheet can be edited
}
```

Key methods:
- `New(title, cheatpath, path, tags, readOnly)` - Load from file
- `Search(reg)` - Search content with a compiled regexp
- `Colorize(conf)` - Apply syntax highlighting (modifies sheet in place)
- `Tagged(needle)` - Check if sheet has the given tag
## Common Operations

### Loading and Displaying a Sheet

```go
// Load sheet
s, err := sheet.New("tar", "personal", "/path/to/tar", []string{"personal"}, false)
if err != nil {
	log.Fatal(err)
}

// Apply syntax highlighting (modifies sheet in place)
s.Colorize(conf)

// Display with pager
display.Write(s.Text, conf)
```

### Working with Sheet Collections

```go
// Load all sheets from cheatpaths (returns a slice of maps, one per cheatpath)
allSheets, err := sheets.Load(conf.Cheatpaths)
if err != nil {
	log.Fatal(err)
}

// Consolidate to handle duplicates (later cheatpaths take precedence)
consolidated := sheets.Consolidate(allSheets)

// Filter by tag (operates on the slice of maps)
filtered := sheets.Filter(allSheets, []string{"networking"})

// Sort alphabetically (returns a sorted slice)
sorted := sheets.Sort(consolidated)
```

### Sheet Format

Cheatsheets are plain text files that may begin with YAML frontmatter:

```yaml
---
syntax: bash
tags: [networking, linux, ssh]
---
# Connect to remote server
ssh user@hostname

# Copy files over SSH
scp local_file user@hostname:/remote/path
```

## Testing

Run tests with:

```bash
make test       # Run all tests
make coverage   # Generate coverage report
go test ./...   # Go test directly
```

Test files follow Go conventions:
- `*_test.go` files in same package
- Table-driven tests for multiple scenarios
- Mock data in `internal/mock` package
## Error Handling

The codebase follows consistent error handling patterns:
- Functions return explicit errors
- Errors are wrapped with context using `fmt.Errorf`
- User-facing errors are written to stderr

Example:

```go
s, err := sheet.New(title, cheatpath, path, tags, false)
if err != nil {
	return fmt.Errorf("failed to load sheet: %w", err)
}
```
## Developing with Docker

It may be useful to test your changes within a pristine environment. An Alpine-based docker container has been provided for that purpose.

Build the docker container:

```bash
make docker-setup
```

Shell into the container:

```bash
make docker-sh
```

The `cheat` source code will be mounted at `/app` within the container.

To destroy the container:

```bash
make distclean
```

[go]: https://go.dev/
@@ -9,20 +9,20 @@

On Unix-like systems, you may simply paste the following snippet into your terminal:

```sh
cd /tmp \
  && wget https://github.com/cheat/cheat/releases/download/4.5.0/cheat-linux-amd64.gz \
  && gunzip cheat-linux-amd64.gz \
  && chmod +x cheat-linux-amd64 \
  && sudo mv cheat-linux-amd64 /usr/local/bin/cheat
```

You may need to change the version number (`4.5.0`) and the archive
(`cheat-linux-amd64.gz`) depending on your platform.

See the [releases page][releases] for a list of supported platforms.

#### Windows

On Windows, download the appropriate binary from the [releases page][releases],
unzip the archive, and place the `cheat.exe` executable on your `PATH`.

### Install via `go install`

If you have `go` version `>=1.17` available on your `PATH`, you can install
`Makefile`: 113 changes
@@ -3,6 +3,9 @@ makefile := $(realpath $(lastword $(MAKEFILE_LIST)))

```make
cmd_dir := ./cmd/cheat
dist_dir := ./dist

# parallel jobs for build-release (can be overridden)
JOBS ?= 8

# executables
CAT := cat
COLUMN := column
```

@@ -31,6 +34,7 @@ TMPDIR := /tmp

```make
# release binaries
releases := \
	$(dist_dir)/cheat-darwin-amd64 \
	$(dist_dir)/cheat-darwin-arm64 \
	$(dist_dir)/cheat-linux-386 \
	$(dist_dir)/cheat-linux-amd64 \
	$(dist_dir)/cheat-linux-arm5 \
```

@@ -44,70 +48,78 @@ releases := \

```make
## build: build an executable for your architecture
.PHONY: build
build: | clean $(dist_dir) fmt lint vet vendor man
	$(GO) build $(BUILD_FLAGS) -o $(dist_dir)/cheat $(cmd_dir)

## build-release: build release executables
# Runs prepare once, then builds all binaries in parallel
# Override jobs with: make build-release JOBS=16
.PHONY: build-release
build-release: prepare
	$(MAKE) -j$(JOBS) $(releases)

# cheat-darwin-amd64
$(dist_dir)/cheat-darwin-amd64:
	GOARCH=amd64 GOOS=darwin \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-darwin-arm64
$(dist_dir)/cheat-darwin-arm64:
	GOARCH=arm64 GOOS=darwin \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-386
$(dist_dir)/cheat-linux-386:
	GOARCH=386 GOOS=linux \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-amd64
$(dist_dir)/cheat-linux-amd64:
	GOARCH=amd64 GOOS=linux \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-arm5
$(dist_dir)/cheat-linux-arm5:
	GOARCH=arm GOOS=linux GOARM=5 \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-arm6
$(dist_dir)/cheat-linux-arm6:
	GOARCH=arm GOOS=linux GOARM=6 \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-arm7
$(dist_dir)/cheat-linux-arm7:
	GOARCH=arm GOOS=linux GOARM=7 \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-linux-arm64
$(dist_dir)/cheat-linux-arm64:
	GOARCH=arm64 GOOS=linux \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-netbsd-amd64
$(dist_dir)/cheat-netbsd-amd64:
	GOARCH=amd64 GOOS=netbsd \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-openbsd-amd64
$(dist_dir)/cheat-openbsd-amd64:
	GOARCH=amd64 GOOS=openbsd \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-plan9-amd64
$(dist_dir)/cheat-plan9-amd64:
	GOARCH=amd64 GOOS=plan9 \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-solaris-amd64
$(dist_dir)/cheat-solaris-amd64:
	GOARCH=amd64 GOOS=solaris \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(GZIP) $@ && chmod -x $@.gz

# cheat-windows-amd64
$(dist_dir)/cheat-windows-amd64.exe:
	GOARCH=amd64 GOOS=windows \
	$(GO) build $(BUILD_FLAGS) -o $@ $(cmd_dir) && $(ZIP) $@.zip $@ -j
```

@@ -115,9 +127,9 @@

```make
$(dist_dir):
	$(MKDIR) $(dist_dir)

# .tmp
.tmp:
	$(MKDIR) .tmp

## install: build and install cheat on your PATH
.PHONY: install
```

@@ -127,7 +139,8 @@ install: build

```make
## clean: remove compiled executables
.PHONY: clean
clean:
	$(RM) -f $(dist_dir)/*
	$(RM) -rf .tmp

## distclean: remove the tags file
.PHONY: distclean
```

@@ -138,7 +151,8 @@ distclean:

```make
## setup: install revive (linter) and scc (sloc tool)
.PHONY: setup
setup:
	$(GO) install github.com/boyter/scc@latest
	$(GO) install github.com/mgechev/revive@latest

## sloc: count "semantic lines of code"
.PHONY: sloc
```

@@ -162,6 +176,7 @@ vendor:

```make
	$(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify

## vendor-update: update vendored dependencies
.PHONY: vendor-update
vendor-update:
	$(GO) get -t -u ./... && $(GO) mod vendor && $(GO) mod tidy && $(GO) mod verify
```

@@ -185,18 +200,70 @@ vet:

```make
test:
	$(GO) test ./...

## test-integration: run integration tests (requires network)
.PHONY: test-integration
test-integration:
	$(GO) test -tags=integration -count=1 ./...

## test-all: run all tests (unit and integration)
.PHONY: test-all
test-all: test test-integration

## test-fuzz: run quick fuzz tests for security-critical functions
.PHONY: test-fuzz
test-fuzz:
	@./build/fuzz.sh 15s

## test-fuzz-long: run extended fuzz tests (10 minutes each)
.PHONY: test-fuzz-long
test-fuzz-long:
	@./build/fuzz.sh 10m

## coverage: generate a test coverage report
.PHONY: coverage
coverage: .tmp
	$(GO) test ./... -coverprofile=.tmp/cheat-coverage.out && \
	$(GO) tool cover -html=.tmp/cheat-coverage.out -o .tmp/cheat-coverage.html && \
	echo "Coverage report generated: .tmp/cheat-coverage.html" && \
	(sensible-browser .tmp/cheat-coverage.html 2>/dev/null || \
	xdg-open .tmp/cheat-coverage.html 2>/dev/null || \
	open .tmp/cheat-coverage.html 2>/dev/null || \
	echo "Please open .tmp/cheat-coverage.html in your browser")

## coverage-text: show test coverage by function in terminal
.PHONY: coverage-text
coverage-text: .tmp
	$(GO) test ./... -coverprofile=.tmp/cheat-coverage.out && \
	$(GO) tool cover -func=.tmp/cheat-coverage.out | $(SORT) -k3 -n

## benchmark: run performance benchmarks
.PHONY: benchmark
benchmark: .tmp
	$(GO) test -tags=integration -bench=. -benchtime=10s -benchmem ./cmd/cheat | tee .tmp/benchmark-latest.txt && \
	$(RM) -f cheat.test

## benchmark-cpu: run benchmarks with CPU profiling
.PHONY: benchmark-cpu
benchmark-cpu: .tmp
	$(GO) test -tags=integration -bench=. -benchtime=10s -cpuprofile=.tmp/cpu.prof ./cmd/cheat && \
	$(RM) -f cheat.test && \
	echo "CPU profile saved to .tmp/cpu.prof" && \
	echo "View with: go tool pprof -http=:8080 .tmp/cpu.prof"

## benchmark-mem: run benchmarks with memory profiling
.PHONY: benchmark-mem
benchmark-mem: .tmp
	$(GO) test -tags=integration -bench=. -benchtime=10s -benchmem -memprofile=.tmp/mem.prof ./cmd/cheat && \
	$(RM) -f cheat.test && \
	echo "Memory profile saved to .tmp/mem.prof" && \
	echo "View with: go tool pprof -http=:8080 .tmp/mem.prof"

## check: format, lint, vet, vendor, and run unit-tests
.PHONY: check
check: | vendor fmt lint vet test

.PHONY: prepare
prepare: | clean $(dist_dir) vendor fmt lint vet test

## docker-setup: create a docker image for use during development
.PHONY: docker-setup
```
@@ -117,7 +117,7 @@

```
cheat tar        # file is named "tar"
cheat foo/bar    # file is named "bar", in a "foo" subdirectory
```

Cheatsheet text may optionally be preceded by a YAML frontmatter header that
assigns tags and specifies syntax:
@@ -1,92 +0,0 @@

```go
//go:build ignore
// +build ignore

// This script embeds `docopt.txt` and `conf.yml` into the binary at
// build time.

package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"os"
	"path/filepath"
)

func main() {

	// get the cwd
	cwd, err := os.Getwd()
	if err != nil {
		log.Fatal(err)
	}

	// get the project root
	root, err := filepath.Abs(cwd + "../../../")
	if err != nil {
		log.Fatal(err)
	}

	// specify template file information
	type file struct {
		In     string
		Out    string
		Method string
	}

	// enumerate the template files to process
	files := []file{
		file{
			In:     "cmd/cheat/docopt.txt",
			Out:    "cmd/cheat/str_usage.go",
			Method: "usage"},
		file{
			In:     "configs/conf.yml",
			Out:    "cmd/cheat/str_config.go",
			Method: "configs"},
	}

	// iterate over each static file
	for _, file := range files {

		// delete the outfile
		os.Remove(filepath.Join(root, file.Out))

		// read the static template
		bytes, err := ioutil.ReadFile(filepath.Join(root, file.In))
		if err != nil {
			log.Fatal(err)
		}

		// render the template
		data := template(file.Method, string(bytes))

		// write the file to the specified outpath
		spath := filepath.Join(root, file.Out)
		err = ioutil.WriteFile(spath, []byte(data), 0644)
		if err != nil {
			log.Fatal(err)
		}
	}
}

// template packages the file body into a generated Go source string.
func template(method string, body string) string {

	// specify the template string
	t := `package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

func %s() string {
	return strings.TrimSpace(%s)
}
`

	return fmt.Sprintf(t, method, "`"+body+"`")
}
```
`build/fuzz.sh`: new executable file, 37 lines
@@ -0,0 +1,37 @@
```bash
#!/bin/bash
#
# Run fuzz tests for cheat
# Usage: ./build/fuzz.sh [duration]
#
# Note: Go's fuzzer will fail immediately if it finds a known failing input
# in the corpus (testdata/fuzz/*). This is by design - it ensures you fix
# known bugs before searching for new ones. To see failing inputs:
#   ls internal/*/testdata/fuzz/*/
#

set -e

DURATION="${1:-15s}"

# Define fuzz tests: "TestName:Package:Description"
TESTS=(
	"FuzzParse:./internal/sheet:YAML frontmatter parsing"
	"FuzzValidateSheetName:./internal/cheatpath:sheet name validation (path traversal protection)"
	"FuzzSearchRegex:./internal/sheet:regex search operations"
	"FuzzSearchCatastrophicBacktracking:./internal/sheet:catastrophic backtracking"
	"FuzzTagged:./internal/sheet:tag matching with malicious input"
	"FuzzFilter:./internal/sheets:tag filtering operations"
	"FuzzTags:./internal/sheets:tag aggregation and sorting"
)

echo "Running fuzz tests ($DURATION each)..."
echo

for i in "${!TESTS[@]}"; do
	IFS=':' read -r test_name package description <<< "${TESTS[$i]}"
	echo "$((i+1)). Testing $description..."
	go test -fuzz="^${test_name}$" -fuzztime="$DURATION" "$package"
	echo
done

echo "All fuzz tests passed!"
```
@@ -17,6 +17,12 @@ func cmdEdit(opts map[string]interface{}, conf config.Config) {

```go
	cheatsheet := opts["--edit"].(string)

	// validate the cheatsheet name
	if err := cheatpath.ValidateSheetName(cheatsheet); err != nil {
		fmt.Fprintf(os.Stderr, "invalid cheatsheet name: %v\n", err)
		os.Exit(1)
	}

	// load the cheatsheets
	cheatsheets, err := sheets.Load(conf.Cheatpaths)
	if err != nil {
```
@@ -5,15 +5,22 @@ import (

```go
	"os"
	"strings"

	"github.com/cheat/cheat/internal/cheatpath"
	"github.com/cheat/cheat/internal/config"
	"github.com/cheat/cheat/internal/sheets"
)

// cmdRemove removes (deletes) a cheatsheet.
func cmdRemove(opts map[string]interface{}, conf config.Config) {

	cheatsheet := opts["--rm"].(string)

	// validate the cheatsheet name
	if err := cheatpath.ValidateSheetName(cheatsheet); err != nil {
		fmt.Fprintf(os.Stderr, "invalid cheatsheet name: %v\n", err)
		os.Exit(1)
	}

	// load the cheatsheets
	cheatsheets, err := sheets.Load(conf.Cheatpaths)
	if err != nil {
```
@@ -31,6 +31,21 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
		)
	}

	// prepare the search pattern
	pattern := "(?i)" + phrase

	// unless --regex is provided, in which case we pass the regex unaltered
	if opts["--regex"] == true {
		pattern = phrase
	}

	// compile the regex once, outside the loop
	reg, err := regexp.Compile(pattern)
	if err != nil {
		fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
		os.Exit(1)
	}

	// iterate over each cheatpath
	out := ""
	for _, pathcheats := range cheatsheets {
@@ -44,21 +59,6 @@ func cmdSearch(opts map[string]interface{}, conf config.Config) {
			continue
		}

		// assume that we want to perform a case-insensitive search for <phrase>
		pattern := "(?i)" + phrase

		// unless --regex is provided, in which case we pass the regex unaltered
		if opts["--regex"] == true {
			pattern = phrase
		}

		// compile the regex
		reg, err := regexp.Compile(pattern)
		if err != nil {
			fmt.Fprintf(os.Stderr, "failed to compile regexp: %s, %v\n", pattern, err)
			os.Exit(1)
		}

		// `Search` will return text entries that match the search terms.
		// We're using it here to overwrite the prior cheatsheet Text,
		// filtering it to only what is relevant.

73 cmd/cheat/config.go Normal file
@@ -0,0 +1,73 @@
package main

// configs returns the default configuration template
func configs() string {
	return `---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# The paths at which cheatsheets are available. Tags associated with a cheatpath
# are automatically attached to all cheatsheets residing on that path.
#
# Whenever cheatsheets share the same title (like 'tar'), the most local
# cheatsheets (those which come later in this file) take precedence over the
# less local sheets. This allows you to create your own "overrides" for
# "upstream" cheatsheets.
#
# But what if you want to view the "upstream" cheatsheets instead of your own?
# Cheatsheets may be filtered by 'tags' in combination with the '--tag' flag.
#
# Example: 'cheat tar --tag=community' will display the 'tar' cheatsheet that
# is tagged as 'community' rather than your own.
#
# Paths that come earlier are considered to be the most "global", and paths
# that come later are considered to be the most "local". The most "local" paths
# take precedence.
#
# See: https://github.com/cheat/cheat/blob/master/doc/cheat.1.md#cheatpaths
cheatpaths:

  # Cheatsheets that are tagged "personal" are stored here by default:
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # Cheatsheets that are tagged "work" are stored here by default:
  - name: work
    path: WORK_PATH
    tags: [ work ]
    readonly: false

  # Community cheatsheets are stored here by default:
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # You can also use glob patterns to automatically load cheatsheets from all
  # directories that match.
  #
  # Example: overload cheatsheets for projects under ~/src/github.com/example/*/
  #- name: example-projects
  #  path: ~/src/github.com/example/**/.cheat
  #  tags: [ example ]
  #  readonly: true`
}
@@ -1,59 +0,0 @@
Usage:
  cheat [options] [<cheatsheet>]

Options:
  --init                   Write a default config file to stdout
  -a --all                 Search among all cheatpaths
  -c --colorize            Colorize output
  -d --directories         List cheatsheet directories
  -e --edit=<cheatsheet>   Edit <cheatsheet>
  -l --list                List cheatsheets
  -p --path=<name>         Return only sheets found on cheatpath <name>
  -r --regex               Treat search <phrase> as a regex
  -s --search=<phrase>     Search cheatsheets for <phrase>
  -t --tag=<tag>           Return only sheets matching <tag>
  -T --tags                List all tags in use
  -v --version             Print the version number
  --rm=<cheatsheet>        Remove (delete) <cheatsheet>
  --conf                   Display the config file path

Examples:

  To initialize a config file:
    mkdir -p ~/.config/cheat && cheat --init > ~/.config/cheat/conf.yml

  To view the tar cheatsheet:
    cheat tar

  To edit (or create) the foo cheatsheet:
    cheat -e foo

  To edit (or create) the foo/bar cheatsheet on the "work" cheatpath:
    cheat -p work -e foo/bar

  To view all cheatsheet directories:
    cheat -d

  To list all available cheatsheets:
    cheat -l

  To list all cheatsheets whose titles match "apt":
    cheat -l apt

  To list all tags in use:
    cheat -T

  To list available cheatsheets that are tagged as "personal":
    cheat -l -t personal

  To search for "ssh" among all cheatsheets, and colorize matches:
    cheat -c -s ssh

  To search (by regex) for cheatsheets that contain an IP address:
    cheat -c -r -s '(?:[0-9]{1,3}\.){3}[0-9]{1,3}'

  To remove (delete) the foo/bar cheatsheet:
    cheat --rm foo/bar

  To view the configuration file path:
    cheat --conf
@@ -1,8 +1,6 @@
// Package main serves as the executable entrypoint.
package main

//go:generate go run ../../build/embed.go

import (
	"fmt"
	"os"
@@ -17,7 +15,7 @@ import (
	"github.com/cheat/cheat/internal/installer"
)

const version = "4.4.2"
const version = "4.5.0"

func main() {

@@ -45,6 +43,7 @@ func main() {
	// read the envvars into a map of strings
	envvars := map[string]string{}
	for _, e := range os.Environ() {
		// os.Environ() guarantees "key=value" format (see ADR-002)
		pair := strings.SplitN(e, "=", 2)
		if runtime.GOOS == "windows" {
			pair[0] = strings.ToUpper(pair[0])

216 cmd/cheat/path_traversal_integration_test.go Normal file
@@ -0,0 +1,216 @@
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
	"testing"
)

// TestPathTraversalIntegration tests that the cheat binary properly blocks
// path traversal attempts when invoked as a subprocess.
func TestPathTraversalIntegration(t *testing.T) {
	// Build the cheat binary
	binPath := filepath.Join(t.TempDir(), "cheat_test")
	if output, err := exec.Command("go", "build", "-o", binPath, ".").CombinedOutput(); err != nil {
		t.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment
	testDir := t.TempDir()
	sheetsDir := filepath.Join(testDir, "sheets")
	os.MkdirAll(sheetsDir, 0755)

	// Create config
	config := fmt.Sprintf(`---
editor: echo
colorize: false
pager: cat
cheatpaths:
  - name: test
    path: %s
    readonly: false
`, sheetsDir)

	configPath := filepath.Join(testDir, "config.yml")
	if err := os.WriteFile(configPath, []byte(config), 0644); err != nil {
		t.Fatalf("Failed to write config: %v", err)
	}

	// Test table
	tests := []struct {
		name     string
		command  []string
		wantFail bool
		wantMsg  string
	}{
		// Blocked patterns
		{
			name:     "block parent traversal edit",
			command:  []string{"--edit", "../evil"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block absolute path edit",
			command:  []string{"--edit", "/etc/passwd"},
			wantFail: true,
			wantMsg:  "cannot be an absolute path",
		},
		{
			name:     "block home dir edit",
			command:  []string{"--edit", "~/.ssh/config"},
			wantFail: true,
			wantMsg:  "cannot start with '~'",
		},
		{
			name:     "block parent traversal remove",
			command:  []string{"--rm", "../evil"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block complex traversal",
			command:  []string{"--edit", "foo/../../bar"},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block just dots",
			command:  []string{"--edit", ".."},
			wantFail: true,
			wantMsg:  "cannot contain '..'",
		},
		{
			name:     "block empty name",
			command:  []string{"--edit", ""},
			wantFail: true,
			wantMsg:  "cannot be empty",
		},
		// Allowed patterns
		{
			name:     "allow simple name",
			command:  []string{"--edit", "docker"},
			wantFail: false,
		},
		{
			name:     "allow nested name",
			command:  []string{"--edit", "lang/go"},
			wantFail: false,
		},
		{
			name:     "block hidden file",
			command:  []string{"--edit", ".gitignore"},
			wantFail: true,
			wantMsg:  "cannot start with '.'",
		},
		{
			name:     "allow current dir",
			command:  []string{"--edit", "./local"},
			wantFail: false,
		},
	}

	// Run tests
	for _, tc := range tests {
		t.Run(tc.name, func(t *testing.T) {
			cmd := exec.Command(binPath, tc.command...)
			cmd.Env = []string{
				fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configPath),
				fmt.Sprintf("HOME=%s", testDir),
			}
			output, err := cmd.CombinedOutput()

			if tc.wantFail {
				if err == nil {
					t.Errorf("Expected failure but command succeeded. Output: %s", output)
				}
				if !strings.Contains(string(output), "invalid cheatsheet name") {
					t.Errorf("Expected 'invalid cheatsheet name' error, got: %s", output)
				}
				if tc.wantMsg != "" && !strings.Contains(string(output), tc.wantMsg) {
					t.Errorf("Expected message %q in output, got: %s", tc.wantMsg, output)
				}
			} else {
				// Command might fail for other reasons (e.g., editor not found)
				// but should NOT fail with "invalid cheatsheet name"
				if strings.Contains(string(output), "invalid cheatsheet name") {
					t.Errorf("Command incorrectly blocked. Output: %s", output)
				}
			}
		})
	}
}

// TestPathTraversalRealWorld tests with more realistic scenarios
func TestPathTraversalRealWorld(t *testing.T) {
	// This test ensures our protection works with actual file operations

	// Build cheat
	binPath := filepath.Join(t.TempDir(), "cheat_test")
	if output, err := exec.Command("go", "build", "-o", binPath, ".").CombinedOutput(); err != nil {
		t.Fatalf("Failed to build: %v\n%s", err, output)
	}

	// Create test structure
	testRoot := t.TempDir()
	sheetsDir := filepath.Join(testRoot, "cheatsheets")
	secretDir := filepath.Join(testRoot, "secrets")
	os.MkdirAll(sheetsDir, 0755)
	os.MkdirAll(secretDir, 0755)

	// Create a "secret" file that should not be accessible
	secretFile := filepath.Join(secretDir, "secret.txt")
	os.WriteFile(secretFile, []byte("SECRET DATA"), 0644)

	// Create config using vim in non-interactive mode
	config := fmt.Sprintf(`---
editor: vim -u NONE -n --cmd "set noswapfile" --cmd "wq"
colorize: false
pager: cat
cheatpaths:
  - name: personal
    path: %s
    readonly: false
`, sheetsDir)

	configPath := filepath.Join(testRoot, "config.yml")
	os.WriteFile(configPath, []byte(config), 0644)

	// Test 1: Try to edit a file outside cheatsheets using traversal
	cmd := exec.Command(binPath, "--edit", "../secrets/secret")
	cmd.Env = []string{
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configPath),
		fmt.Sprintf("HOME=%s", testRoot),
	}
	output, err := cmd.CombinedOutput()

	if err == nil || !strings.Contains(string(output), "invalid cheatsheet name") {
		t.Errorf("Path traversal was not blocked! Output: %s", output)
	}

	// Test 2: Verify the secret file is still intact
	content, _ := os.ReadFile(secretFile)
	if string(content) != "SECRET DATA" {
		t.Errorf("Secret file was modified!")
	}

	// Test 3: Verify no files were created outside sheets directory
	err = filepath.Walk(testRoot, func(path string, info os.FileInfo, err error) error {
		if err != nil {
			return err
		}
		if !info.IsDir() &&
			path != configPath &&
			path != secretFile &&
			!strings.HasPrefix(path, sheetsDir) {
			t.Errorf("File created outside allowed directory: %s", path)
		}
		return nil
	})
	if err != nil {
		t.Errorf("Walk error: %v", err)
	}
}
209 cmd/cheat/search_bench_test.go Normal file
@@ -0,0 +1,209 @@
//go:build integration

package main

import (
	"bytes"
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"testing"
	"time"

	"github.com/go-git/go-git/v5"
	"github.com/go-git/go-git/v5/plumbing"
)

// BenchmarkSearchCommand benchmarks the actual cheat search command
func BenchmarkSearchCommand(b *testing.B) {
	// Build the cheat binary in .tmp (using absolute path)
	rootDir, err := filepath.Abs(filepath.Join("..", ".."))
	if err != nil {
		b.Fatalf("Failed to get root dir: %v", err)
	}
	tmpDir := filepath.Join(rootDir, ".tmp", "bench-test")
	if err := os.MkdirAll(tmpDir, 0755); err != nil {
		b.Fatalf("Failed to create temp dir: %v", err)
	}

	cheatBin := filepath.Join(tmpDir, "cheat-bench")

	// Clean up the binary when done
	b.Cleanup(func() {
		os.Remove(cheatBin)
	})

	cmd := exec.Command("go", "build", "-o", cheatBin, "./cmd/cheat")
	cmd.Dir = rootDir
	if output, err := cmd.CombinedOutput(); err != nil {
		b.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment in .tmp
	configDir := filepath.Join(tmpDir, "config")
	cheatsheetDir := filepath.Join(configDir, "cheatsheets", "community")

	// Clone community cheatsheets (or reuse if already exists)
	if _, err := os.Stat(cheatsheetDir); os.IsNotExist(err) {
		b.Logf("Cloning community cheatsheets to %s...", cheatsheetDir)
		_, err := git.PlainClone(cheatsheetDir, false, &git.CloneOptions{
			URL:           "https://github.com/cheat/cheatsheets.git",
			Depth:         1,
			SingleBranch:  true,
			ReferenceName: plumbing.ReferenceName("refs/heads/master"),
			Progress:      nil,
		})
		if err != nil {
			b.Fatalf("Failed to clone cheatsheets: %v", err)
		}
	}

	// Create a minimal config file
	configFile := filepath.Join(configDir, "conf.yml")
	configContent := fmt.Sprintf(`---
cheatpaths:
  - name: community
    path: %s
    tags: [ community ]
    readonly: true
`, cheatsheetDir)

	if err := os.MkdirAll(configDir, 0755); err != nil {
		b.Fatalf("Failed to create config dir: %v", err)
	}
	if err := os.WriteFile(configFile, []byte(configContent), 0644); err != nil {
		b.Fatalf("Failed to write config: %v", err)
	}

	// Set environment to use our config
	env := append(os.Environ(),
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configFile),
	)

	// Define test cases
	testCases := []struct {
		name string
		args []string
	}{
		{"SimpleSearch", []string{"-s", "echo"}},
		{"RegexSearch", []string{"-r", "-s", "^#.*example"}},
		{"ColorizedSearch", []string{"-c", "-s", "grep"}},
		{"ComplexRegex", []string{"-r", "-s", "(git|hg|svn)\\s+(add|commit|push)"}},
		{"AllCheatpaths", []string{"-a", "-s", "list"}},
	}

	// Warm up - run once to ensure everything is loaded
	warmupCmd := exec.Command(cheatBin, "-l")
	warmupCmd.Env = env
	warmupCmd.Run()

	// Run benchmarks
	for _, tc := range testCases {
		b.Run(tc.name, func(b *testing.B) {
			// Reset timer to exclude setup
			b.ResetTimer()

			for i := 0; i < b.N; i++ {
				cmd := exec.Command(cheatBin, tc.args...)
				cmd.Env = env

				// Capture output to prevent spamming
				var stdout, stderr bytes.Buffer
				cmd.Stdout = &stdout
				cmd.Stderr = &stderr

				start := time.Now()
				err := cmd.Run()
				elapsed := time.Since(start)

				if err != nil {
					b.Fatalf("Command failed: %v\nStderr: %s", err, stderr.String())
				}

				// Report custom metric
				b.ReportMetric(float64(elapsed.Nanoseconds())/1e6, "ms/op")

				// Ensure we got some results
				if stdout.Len() == 0 {
					b.Fatal("No output from search")
				}
			}
		})
	}
}

// BenchmarkListCommand benchmarks the list command for comparison
func BenchmarkListCommand(b *testing.B) {
	// Build the cheat binary in .tmp (using absolute path)
	rootDir, err := filepath.Abs(filepath.Join("..", ".."))
	if err != nil {
		b.Fatalf("Failed to get root dir: %v", err)
	}
	tmpDir := filepath.Join(rootDir, ".tmp", "bench-test")
	if err := os.MkdirAll(tmpDir, 0755); err != nil {
		b.Fatalf("Failed to create temp dir: %v", err)
	}

	cheatBin := filepath.Join(tmpDir, "cheat-bench")

	// Clean up the binary when done
	b.Cleanup(func() {
		os.Remove(cheatBin)
	})

	cmd := exec.Command("go", "build", "-o", cheatBin, "./cmd/cheat")
	cmd.Dir = rootDir
	if output, err := cmd.CombinedOutput(); err != nil {
		b.Fatalf("Failed to build cheat: %v\nOutput: %s", err, output)
	}

	// Set up test environment (simplified - reuse if possible)
	configDir := filepath.Join(tmpDir, "config")
	cheatsheetDir := filepath.Join(configDir, "cheatsheets", "community")

	// Check if we need to clone
	if _, err := os.Stat(cheatsheetDir); os.IsNotExist(err) {
		_, err := git.PlainClone(cheatsheetDir, false, &git.CloneOptions{
			URL:           "https://github.com/cheat/cheatsheets.git",
			Depth:         1,
			SingleBranch:  true,
			ReferenceName: plumbing.ReferenceName("refs/heads/master"),
			Progress:      nil,
		})
		if err != nil {
			b.Fatalf("Failed to clone cheatsheets: %v", err)
		}
	}

	// Create config
	configFile := filepath.Join(configDir, "conf.yml")
	configContent := fmt.Sprintf(`---
cheatpaths:
  - name: community
    path: %s
    tags: [ community ]
    readonly: true
`, cheatsheetDir)

	os.MkdirAll(configDir, 0755)
	os.WriteFile(configFile, []byte(configContent), 0644)

	env := append(os.Environ(),
		fmt.Sprintf("CHEAT_CONFIG_PATH=%s", configFile),
	)

	b.ResetTimer()

	for i := 0; i < b.N; i++ {
		cmd := exec.Command(cheatBin, "-l")
		cmd.Env = env

		var stdout bytes.Buffer
		cmd.Stdout = &stdout

		if err := cmd.Run(); err != nil {
			b.Fatalf("Command failed: %v", err)
		}
	}
}
@@ -1,93 +0,0 @@
package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

func configs() string {
	return strings.TrimSpace(`---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# Cheatpaths are paths at which cheatsheets are available on your local
# filesystem.
#
# It is useful to sort cheatsheets into different cheatpaths for organizational
# purposes. For example, you might want one cheatpath for community
# cheatsheets, one for personal cheatsheets, one for cheatsheets pertaining to
# your day job, one for code snippets, etc.
#
# Cheatpaths are scoped, such that more "local" cheatpaths take priority over
# more "global" cheatpaths. (The most global cheatpath is listed first in this
# file; the most local is listed last.) For example, if there is a 'tar'
# cheatsheet on both global and local paths, you'll be presented with the local
# one by default. ('cheat -p' can be used to view cheatsheets from alternative
# cheatpaths.)
#
# Cheatpaths can also be tagged as "read only". This instructs cheat not to
# automatically create cheatsheets on a read-only cheatpath. Instead, when you
# would like to edit a read-only cheatsheet using 'cheat -e', cheat will
# perform a copy-on-write of that cheatsheet from a read-only cheatpath to a
# writeable cheatpath.
#
# This is very useful when you would like to maintain, for example, a
# "pristine" repository of community cheatsheets on one cheatpath, and an
# editable personal repository of cheatsheets on another cheatpath.
#
# Cheatpaths can be also configured to automatically apply tags to cheatsheets
# on certain paths, which can be useful for querying purposes.
# Example: 'cheat -t work jenkins'.
#
# Community cheatsheets must be installed separately, though you may have
# downloaded them automatically when installing 'cheat'. If not, you may
# download them here:
#
# https://github.com/cheat/cheatsheets
cheatpaths:
  # Cheatpath properties mean the following:
  #   'name':     the name of the cheatpath (view with 'cheat -d', filter with 'cheat -p')
  #   'path':     the filesystem path of the cheatsheet directory (view with 'cheat -d')
  #   'tags':     tags that should be automatically applied to sheets on this path
  #   'readonly': shall user-created ('cheat -e') cheatsheets be saved here?
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # If you have personalized cheatsheets, list them last. They will take
  # precedence over the more global cheatsheets.
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # While it requires no configuration here, it's also worth noting that
  # cheat will automatically append directories named '.cheat' within the
  # current working directory to the 'cheatpath'. This can be very useful if
  # you'd like to closely associate cheatsheets with, for example, a directory
  # containing source code.
  #
  # Such "directory-scoped" cheatsheets will be treated as the most "local"
  # cheatsheets, and will override less "local" cheatsheets. Similarly,
  # directory-scoped cheatsheets will always be editable ('readonly: false').
`)
}
@@ -1,13 +1,8 @@
package main

// Code generated .* DO NOT EDIT.

import (
	"strings"
)

// usage returns the usage text for the cheat command
func usage() string {
	return strings.TrimSpace(`Usage:
	return `Usage:
  cheat [options] [<cheatsheet>]

Options:
@@ -65,6 +60,5 @@ Examples:
	cheat --rm foo/bar

To view the configuration file path:
	cheat --conf
`)
	cheat --conf`
}
@@ -1,82 +0,0 @@
---
# The editor to use with 'cheat -e <sheet>'. Defaults to $EDITOR or $VISUAL.
editor: EDITOR_PATH

# Should 'cheat' always colorize output?
colorize: false

# Which 'chroma' colorscheme should be applied to the output?
# Options are available here:
# https://github.com/alecthomas/chroma/tree/master/styles
style: monokai

# Which 'chroma' "formatter" should be applied?
# One of: "terminal", "terminal256", "terminal16m"
formatter: terminal256

# Through which pager should output be piped?
# 'less -FRX' is recommended on Unix systems
# 'more' is recommended on Windows
pager: PAGER_PATH

# Cheatpaths are paths at which cheatsheets are available on your local
# filesystem.
#
# It is useful to sort cheatsheets into different cheatpaths for organizational
# purposes. For example, you might want one cheatpath for community
# cheatsheets, one for personal cheatsheets, one for cheatsheets pertaining to
# your day job, one for code snippets, etc.
#
# Cheatpaths are scoped, such that more "local" cheatpaths take priority over
# more "global" cheatpaths. (The most global cheatpath is listed first in this
# file; the most local is listed last.) For example, if there is a 'tar'
# cheatsheet on both global and local paths, you'll be presented with the local
# one by default. ('cheat -p' can be used to view cheatsheets from alternative
# cheatpaths.)
#
# Cheatpaths can also be tagged as "read only". This instructs cheat not to
# automatically create cheatsheets on a read-only cheatpath. Instead, when you
# would like to edit a read-only cheatsheet using 'cheat -e', cheat will
# perform a copy-on-write of that cheatsheet from a read-only cheatpath to a
# writeable cheatpath.
#
# This is very useful when you would like to maintain, for example, a
# "pristine" repository of community cheatsheets on one cheatpath, and an
# editable personal repository of cheatsheets on another cheatpath.
#
# Cheatpaths can be also configured to automatically apply tags to cheatsheets
# on certain paths, which can be useful for querying purposes.
# Example: 'cheat -t work jenkins'.
#
# Community cheatsheets must be installed separately, though you may have
# downloaded them automatically when installing 'cheat'. If not, you may
# download them here:
#
# https://github.com/cheat/cheatsheets
cheatpaths:
  # Cheatpath properties mean the following:
  #   'name':     the name of the cheatpath (view with 'cheat -d', filter with 'cheat -p')
  #   'path':     the filesystem path of the cheatsheet directory (view with 'cheat -d')
  #   'tags':     tags that should be automatically applied to sheets on this path
  #   'readonly': shall user-created ('cheat -e') cheatsheets be saved here?
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true

  # If you have personalized cheatsheets, list them last. They will take
  # precedence over the more global cheatsheets.
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false

  # While it requires no configuration here, it's also worth noting that
  # cheat will automatically append directories named '.cheat' within the
  # current working directory to the 'cheatpath'. This can be very useful if
  # you'd like to closely associate cheatsheets with, for example, a directory
  # containing source code.
  #
  # Such "directory-scoped" cheatsheets will be treated as the most "local"
  # cheatsheets, and will override less "local" cheatsheets. Similarly,
  # directory-scoped cheatsheets will always be editable ('readonly: false').
169 doc/adr/001-path-traversal-protection.md Normal file
@@ -0,0 +1,169 @@
# ADR-001: Path Traversal Protection for Cheatsheet Names
|
||||
|
||||
Date: 2025-01-21
|
||||
|
||||
## Status

Accepted

## Context

The `cheat` tool allows users to create, edit, and remove cheatsheets using commands like:

- `cheat --edit <name>`
- `cheat --rm <name>`

Without validation, a user could potentially provide malicious names like:

- `../../../etc/passwd` (directory traversal)
- `/etc/passwd` (absolute path)
- `~/.ssh/authorized_keys` (home directory expansion)

While `cheat` is a local tool run by the user themselves (not a network service), path traversal could still lead to:

1. Accidental file overwrites outside cheatsheet directories
2. Confusion about where files are being created
3. Potential security issues in shared environments

## Decision

We implemented input validation for cheatsheet names to prevent directory traversal attacks. The validation rejects names that:

1. Contain `..` (parent directory references)
2. Are absolute paths (start with `/` on Unix)
3. Start with `~` (home directory expansion)
4. Are empty
5. Start with `.` (hidden files, which cheat does not display)

The validation is performed at the application layer before any file operations occur.

## Implementation Details

### Validation Function

The validation is implemented in `internal/cheatpath/validate.go`:

```go
func ValidateSheetName(name string) error {
	// Reject empty names
	if name == "" {
		return fmt.Errorf("cheatsheet name cannot be empty")
	}

	// Reject names containing directory traversal
	if strings.Contains(name, "..") {
		return fmt.Errorf("cheatsheet name cannot contain '..'")
	}

	// Reject absolute paths
	if filepath.IsAbs(name) {
		return fmt.Errorf("cheatsheet name cannot be an absolute path")
	}

	// Reject names that start with ~ (home directory expansion)
	if strings.HasPrefix(name, "~") {
		return fmt.Errorf("cheatsheet name cannot start with '~'")
	}

	// Reject hidden files (files that start with a dot)
	filename := filepath.Base(name)
	if strings.HasPrefix(filename, ".") {
		return fmt.Errorf("cheatsheet name cannot start with '.' (hidden files are not supported)")
	}

	return nil
}
```

### Integration Points

The validation is called in:
- `cmd/cheat/cmd_edit.go` - before creating or editing a cheatsheet
- `cmd/cheat/cmd_remove.go` - before removing a cheatsheet

### Allowed Patterns

The following patterns are explicitly allowed:
- Simple names: `docker`, `git`
- Nested paths: `docker/compose`, `lang/go/slice`
- Current directory references: `./mysheet`
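One subtlety worth noting: the hidden-file check runs on `filepath.Base(name)`, which is why `./mysheet` is allowed while a nested hidden name like `docker/.secret` is rejected. A small runnable sketch (the validation is re-implemented locally here for illustration only; the real function is `ValidateSheetName` in `internal/cheatpath/validate.go`):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// validateSheetName mirrors the checks described above,
// re-implemented here purely for demonstration.
func validateSheetName(name string) error {
	switch {
	case name == "":
		return fmt.Errorf("name cannot be empty")
	case strings.Contains(name, ".."):
		return fmt.Errorf("name cannot contain '..'")
	case filepath.IsAbs(name):
		return fmt.Errorf("name cannot be an absolute path")
	case strings.HasPrefix(name, "~"):
		return fmt.Errorf("name cannot start with '~'")
	case strings.HasPrefix(filepath.Base(name), "."):
		return fmt.Errorf("name cannot start with '.'")
	}
	return nil
}

func main() {
	// Allowed: simple, nested, and ./-prefixed names.
	fmt.Println(validateSheetName("docker"))         // <nil>
	fmt.Println(validateSheetName("./mysheet"))      // <nil>
	// Rejected: traversal, absolute paths, and hidden base names.
	fmt.Println(validateSheetName("../etc/passwd") != nil)  // true
	fmt.Println(validateSheetName("/etc/passwd") != nil)    // true
	fmt.Println(validateSheetName("docker/.secret") != nil) // true
}
```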
## Consequences

### Positive

1. **Safety**: Prevents accidental or intentional file operations outside cheatsheet directories
2. **Simplicity**: Validation happens early, before any file operations
3. **User-friendly**: Clear error messages explain why a name was rejected
4. **Performance**: Minimal overhead - simple string checks
5. **Compatibility**: Doesn't break existing valid cheatsheet names

### Negative

1. **Limitation**: Users cannot use `..` in cheatsheet names even when legitimate
2. **No symlink support**: Cannot create cheatsheets through symlinks outside the cheatpath

### Neutral

1. Uses Go's `filepath.IsAbs()`, which handles platform differences (Windows vs Unix)
2. No attempt to resolve or canonicalize paths - validation is purely syntactic

## Security Considerations

### Threat Model

`cheat` is a local command-line tool, not a network service. The primary threats are:
- User error (accidentally overwriting important files)
- Malicious scripts that invoke `cheat` with crafted arguments
- Shared system scenarios where cheatsheets might be shared

### What This Protects Against

- Directory traversal using `../`
- Absolute path access to system files
- Shell expansion of `~` to the home directory
- Empty names that might cause unexpected behavior
- Hidden files that wouldn't be displayed anyway

### What This Does NOT Protect Against

- Users with filesystem permissions can still directly edit any file
- Symbolic links within the cheatpath pointing outside it
- Race conditions (TOCTOU) - though the risk is minimal for a local tool
- Malicious content within cheatsheets themselves

## Testing

Comprehensive tests ensure the validation works correctly:

1. **Unit tests** (`internal/cheatpath/validate_test.go`) verify the validation logic
2. **Integration tests** verify the actual binary blocks malicious inputs
3. **No system files are accessed** during testing - all tests use isolated directories

Example test cases:
```bash
# These are blocked:
cheat --edit "../../../etc/passwd"
cheat --edit "/etc/passwd"
cheat --edit "~/.ssh/config"
cheat --rm ".."

# These are allowed:
cheat --edit "docker"
cheat --edit "docker/compose"
cheat --edit "./local"
```

## Alternative Approaches Considered

1. **Path resolution and verification**: Resolve the final path and check whether it falls within the cheatpath
   - Rejected: More complex, potential race conditions, platform-specific edge cases

2. **Chroot/sandbox**: Run file operations in a restricted environment
   - Rejected: Overkill for a local tool, platform compatibility issues

3. **Filename allowlist**: Only allow alphanumeric characters and specific symbols
   - Rejected: Too restrictive, would break existing cheatsheets with valid special characters

## References

- OWASP Path Traversal: https://owasp.org/www-community/attacks/Path_Traversal
- CWE-22: Improper Limitation of a Pathname to a Restricted Directory
- Go filepath package documentation: https://pkg.go.dev/path/filepath
100  doc/adr/002-environment-variable-parsing.md  Normal file
@@ -0,0 +1,100 @@
# ADR-002: No Defensive Checks for Environment Variable Parsing

Date: 2025-01-21

## Status

Accepted

## Context

In `cmd/cheat/main.go` lines 47-52, the code parses environment variables assuming they all contain an equals sign:

```go
for _, e := range os.Environ() {
	pair := strings.SplitN(e, "=", 2)
	if runtime.GOOS == "windows" {
		pair[0] = strings.ToUpper(pair[0])
	}
	envvars[pair[0]] = pair[1] // Could panic if pair has < 2 elements
}
```

If `os.Environ()` returned a string without an equals sign, `strings.SplitN` would return a slice with only one element, causing a panic when accessing `pair[1]`.

## Decision

We will **not** add defensive checks for this condition. The current code that assumes all environment strings contain "=" will remain unchanged.

## Rationale

### Go Runtime Guarantees

Go's official documentation guarantees that `os.Environ()` returns environment variables in the form "key=value". This is a documented contract of the Go runtime that has been stable since Go 1.0.

### Empirical Evidence

Testing across platforms confirms:
- All environment variables returned by `os.Environ()` contain at least one "="
- Empty environment variables appear as "KEY=" (with an empty value)
- Even Windows special variables like "=C:=C:\path" maintain the format
### Cost-Benefit Analysis

Adding defensive code would:
- **Cost**: Add complexity and cognitive overhead
- **Cost**: Suggest uncertainty about Go's documented behavior
- **Cost**: Create dead code that can never execute under normal conditions
- **Benefit**: Protect against a theoretical scenario that violates Go's guarantees

The only scenarios where this could panic are:
1. A bug in Go's runtime (extremely unlikely, would affect all Go programs)
2. Corrupted OS-level environment (would cause broader system issues)
3. A breaking change in a future Go version (would break many programs, unlikely)

## Consequences

### Positive
- Simpler, more readable code
- Trust in platform guarantees reduces unnecessary defensive programming
- No performance overhead from unnecessary checks

### Negative
- Theoretical panic if Go's guarantees are violated

### Neutral
- Follows Go community standards of trusting standard library contracts

## Alternatives Considered

### 1. Add Defensive Check
```go
if len(pair) < 2 {
	continue // or pair[1] = ""
}
```
**Rejected**: Adds complexity for a condition that should never occur.

### 2. Add Panic with Clear Message
```go
if len(pair) < 2 {
	panic("os.Environ() contract violation: " + e)
}
```
**Rejected**: Would crash the program for the same theoretical issue.

### 3. Add Comment Documenting Assumption
```go
// os.Environ() guarantees "key=value" format, so pair[1] is safe
envvars[pair[0]] = pair[1]
```
**Rejected**: While documentation is good, this particular guarantee is fundamental to Go.

## Notes

If Go ever changes this behavior (extremely unlikely, as it would break compatibility), it would be caught immediately in testing, as the program would panic on startup. This would be a clear signal to revisit this decision.

## References

- Go os.Environ() documentation: https://pkg.go.dev/os#Environ
- Go os.Environ() source code and tests
104  doc/adr/003-search-parallelization.md  Normal file
@@ -0,0 +1,104 @@
# ADR-003: No Parallelization for Search Operations

Date: 2025-01-22

## Status

Accepted

## Context

We investigated optimizing cheat's search performance through parallelization. Initial assumptions suggested that I/O operations (reading multiple cheatsheet files) would be the primary bottleneck, making parallel processing beneficial.

Performance benchmarks were implemented to measure search operations, and a parallel search implementation using goroutines was created and tested.

## Decision

We will **not** implement parallel search. The sequential implementation will remain unchanged.

## Rationale

### Performance Profile Analysis

CPU profiling revealed that search performance is dominated by:
- **Process creation overhead** (~30% in `os/exec.(*Cmd).Run`)
- **System calls** (~30% in `syscall.Syscall6`)
- **Process management** (fork, exec, pipe setup)

The actual search logic (regex matching, file reading) was negligible in the profile, indicating our optimization efforts were targeting the wrong bottleneck.

### Benchmark Results

The parallel implementation showed minimal improvements:
- Simple search: 17ms → 15.3ms (10% improvement)
- Regex search: 15ms → 14.9ms (minimal improvement)
- Colorized search: 19.5ms → 16.8ms (14% improvement)
- Complex regex: 20ms → 15.3ms (24% improvement)

The best case saved only ~5ms in absolute terms.
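For reference, the rejected approach fanned sheet scans out over goroutines roughly as follows. This is a simplified sketch only; the actual reverted code lives in commit 82eb918, and the `sheet` type and substring matching here are stand-ins for the real structures and regex logic:

```go
package main

import (
	"fmt"
	"sort"
	"strings"
	"sync"
)

// sheet is a stand-in for a loaded cheatsheet.
type sheet struct {
	Title, Text string
}

// searchParallel scans sheets concurrently and returns matching titles:
// one goroutine per sheet, results gathered over a buffered channel.
func searchParallel(sheets []sheet, phrase string) []string {
	var wg sync.WaitGroup
	results := make(chan string, len(sheets))
	for _, s := range sheets {
		wg.Add(1)
		go func(s sheet) {
			defer wg.Done()
			if strings.Contains(s.Text, phrase) {
				results <- s.Title
			}
		}(s)
	}
	wg.Wait()
	close(results)
	var titles []string
	for t := range results {
		titles = append(titles, t)
	}
	// Goroutine completion order is nondeterministic, so sort for
	// stable output -- one of the extra costs sequential code avoids.
	sort.Strings(titles)
	return titles
}

func main() {
	sheets := []sheet{
		{"docker", "docker ps lists containers"},
		{"git", "git log shows history"},
		{"tar", "tar -xzf extracts archives"},
	}
	fmt.Println(searchParallel(sheets, "lists")) // [docker]
}
```

The synchronization boilerplate visible even in this toy version illustrates the complexity cost weighed below.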
### Cost-Benefit Analysis

**Costs of parallelization:**
- Added complexity with goroutines, channels, and synchronization
- Increased maintenance burden
- More difficult debugging and testing
- Potential race conditions

**Benefits:**
- 5-15% performance improvement (5ms in real terms)
- Imperceptible to users in interactive use

### User Experience Perspective

For a command-line tool:
- The current 15-20ms response time is excellent
- Users cannot perceive 5ms differences
- Sub-50ms is considered "instant" in HCI research

## Consequences

### Positive
- Simpler, more maintainable codebase
- Easier to debug and reason about
- No synchronization bugs or race conditions
- Focus remains on code clarity

### Negative
- Missed opportunity for ~5ms performance gain
- Search remains single-threaded

### Neutral
- Performance remains excellent for the intended use case
- Follows Go philosophy of preferring simplicity

## Alternatives Considered

### 1. Keep Parallel Implementation
**Rejected**: Complexity outweighs negligible performance gains.

### 2. Optimize Process Startup
**Rejected**: Process creation overhead is inherent to CLI tools and cannot be avoided without fundamental architecture changes.

### 3. Future Optimizations
If performance becomes critical, consider:
- **Long-running daemon**: Eliminate process startup overhead entirely
- **Shell function**: Reduce fork/exec overhead
- **Compiled-in cheatsheets**: Eliminate file I/O

However, these would fundamentally change the tool's architecture and usage model.

## Notes

This decision reinforces important principles:
1. Always profile before optimizing
2. Consider the full execution context
3. Measure what matters to users
4. Complexity has a real cost

The parallelization attempt was valuable as a learning exercise and definitively answered whether this optimization path was worthwhile.

## References

- Benchmark implementation: cmd/cheat/search_bench_test.go
- Reverted parallel implementation: see git history (commit 82eb918)
38  doc/adr/README.md  Normal file
@@ -0,0 +1,38 @@
# Architecture Decision Records

This directory contains Architecture Decision Records (ADRs) for the cheat project.

## What is an ADR?

An Architecture Decision Record captures an important architectural decision along with its context and consequences. ADRs help us:

- Document why decisions were made
- Understand the context and trade-offs
- Review decisions when requirements change
- Onboard new contributors

## ADR Format

Each ADR follows this template:

1. **Title**: ADR-NNN: Brief description
2. **Date**: When the decision was made
3. **Status**: Proposed, Accepted, Deprecated, Superseded
4. **Context**: What prompted this decision?
5. **Decision**: What did we decide to do?
6. **Consequences**: What are the positive, negative, and neutral outcomes?

## Index of ADRs

| ADR | Title | Status | Date |
|-----|-------|--------|------|
| [001](001-path-traversal-protection.md) | Path Traversal Protection for Cheatsheet Names | Accepted | 2025-01-21 |
| [002](002-environment-variable-parsing.md) | No Defensive Checks for Environment Variable Parsing | Accepted | 2025-01-21 |
| [003](003-search-parallelization.md) | No Parallelization for Search Operations | Accepted | 2025-01-22 |

## Creating a New ADR

1. Copy the template from an existing ADR
2. Use the next sequential number
3. Fill in all sections
4. Include the ADR alongside the commit implementing the decision
98  doc/cheat.1
@@ -1,31 +1,14 @@
```
.\" Automatically generated by Pandoc 2.17.1.1
.\" Automatically generated by Pandoc 3.1.11.1
.\"
.\" Define V font for inline verbatim, using C font in formats
.\" that render this, and otherwise B font.
.ie "\f[CB]x\f[]"x" \{\
. ftr V B
. ftr VI BI
. ftr VB B
. ftr VBI BI
.\}
.el \{\
. ftr V CR
. ftr VI CI
. ftr VB CB
. ftr VBI CBI
.\}
.TH "CHEAT" "1" "" "" "General Commands Manual"
.hy
.SH NAME
.PP
\f[B]cheat\f[R] \[em] create and view command-line cheatsheets
\f[B]cheat\f[R] \[em] create and view command\-line cheatsheets
.SH SYNOPSIS
.PP
\f[B]cheat\f[R] [options] [\f[I]CHEATSHEET\f[R]]
.SH DESCRIPTION
.PP
\f[B]cheat\f[R] allows you to create and view interactive cheatsheets on
the command-line.
the command\-line.
It was designed to help remind *nix system administrators of options for
commands that they use frequently, but not frequently enough to
remember.
@@ -34,34 +17,40 @@ remember.
\[en]init
Print a config file to stdout.
.TP
-c, \[en]colorize
\[en]conf
Display the config file path.
.TP
\-a, \[en]all
Search among all cheatpaths.
.TP
\-c, \[en]colorize
Colorize output.
.TP
-d, \[en]directories
\-d, \[en]directories
List cheatsheet directories.
.TP
-e, \[en]edit=\f[I]CHEATSHEET\f[R]
\-e, \[en]edit=\f[I]CHEATSHEET\f[R]
Open \f[I]CHEATSHEET\f[R] for editing.
.TP
-l, \[en]list
\-l, \[en]list
List available cheatsheets.
.TP
-p, \[en]path=\f[I]PATH\f[R]
\-p, \[en]path=\f[I]PATH\f[R]
Filter only to sheets found on path \f[I]PATH\f[R].
.TP
-r, \[en]regex
\-r, \[en]regex
Treat search \f[I]PHRASE\f[R] as a regular expression.
.TP
-s, \[en]search=\f[I]PHRASE\f[R]
\-s, \[en]search=\f[I]PHRASE\f[R]
Search cheatsheets for \f[I]PHRASE\f[R].
.TP
-t, \[en]tag=\f[I]TAG\f[R]
\-t, \[en]tag=\f[I]TAG\f[R]
Filter only to sheets tagged with \f[I]TAG\f[R].
.TP
-T, \[en]tags
\-T, \[en]tags
List all tags in use.
.TP
-v, \[en]version
\-v, \[en]version
Print the version number.
.TP
\[en]rm=\f[I]CHEATSHEET\f[R]
@@ -72,37 +61,39 @@ To view the foo cheatsheet:
cheat \f[I]foo\f[R]
.TP
To edit (or create) the foo cheatsheet:
cheat -e \f[I]foo\f[R]
cheat \-e \f[I]foo\f[R]
.TP
To edit (or create) the foo/bar cheatsheet on the `work' cheatpath:
cheat -p \f[I]work\f[R] -e \f[I]foo/bar\f[R]
cheat \-p \f[I]work\f[R] \-e \f[I]foo/bar\f[R]
.TP
To view all cheatsheet directories:
cheat -d
cheat \-d
.TP
To list all available cheatsheets:
cheat -l
cheat \-l
.TP
To list all cheatsheets whose titles match `apt':
cheat -l \f[I]apt\f[R]
cheat \-l \f[I]apt\f[R]
.TP
To list all tags in use:
cheat -T
cheat \-T
.TP
To list available cheatsheets that are tagged as `personal':
cheat -l -t \f[I]personal\f[R]
cheat \-l \-t \f[I]personal\f[R]
.TP
To search for `ssh' among all cheatsheets, and colorize matches:
cheat -c -s \f[I]ssh\f[R]
cheat \-c \-s \f[I]ssh\f[R]
.TP
To search (by regex) for cheatsheets that contain an IP address:
cheat -c -r -s \f[I]`(?:[0-9]{1,3}.){3}[0-9]{1,3}'\f[R]
cheat \-c \-r \-s \f[I]`(?:[0\-9]{1,3}.){3}[0\-9]{1,3}'\f[R]
.TP
To remove (delete) the foo/bar cheatsheet:
cheat \[en]rm \f[I]foo/bar\f[R]
.TP
To view the configuration file path:
cheat \[en]conf
.SH FILES
.SS Configuration
.PP
\f[B]cheat\f[R] is configured via a YAML file that is conventionally
named \f[I]conf.yaml\f[R].
\f[B]cheat\f[R] will search for \f[I]conf.yaml\f[R] in varying
@@ -133,24 +124,28 @@ Alternatively, you may also generate a config file manually by running
\f[B]cheat \[en]init\f[R] and saving its output to the appropriate
location for your platform.
.SS Cheatpaths
.PP
\f[B]cheat\f[R] reads its cheatsheets from \[lq]cheatpaths\[rq], which
are the directories in which cheatsheets are stored.
Cheatpaths may be configured in \f[I]conf.yaml\f[R], and viewed via
\f[B]cheat -d\f[R].
\f[B]cheat \-d\f[R].
.PP
For detailed instructions on how to configure cheatpaths, please refer
to the comments in conf.yml.
.SS Autocompletion
.PP
Autocompletion scripts for \f[B]bash\f[R], \f[B]zsh\f[R], and
\f[B]fish\f[R] are available for download:
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.bash>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.bash
.UE \c
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.fish>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.fish
.UE \c
.IP \[bu] 2
<https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh>
\c
.UR https://github.com/cheat/cheat/blob/master/scripts/cheat.zsh
.UE \c
.PP
The \f[B]bash\f[R] and \f[B]zsh\f[R] scripts provide optional
integration with \f[B]fzf\f[R], if the latter is available on your
@@ -176,11 +171,12 @@ Application error
.IP "2." 3
Cheatsheet(s) not found
.SH BUGS
.PP
See GitHub issues: <https://github.com/cheat/cheat/issues>
See GitHub issues: \c
.UR https://github.com/cheat/cheat/issues
.UE \c
.SH AUTHOR
.PP
Christopher Allen Lane <chris@chris-allen-lane.com>
Christopher Allen Lane \c
.MT chris@chris-allen-lane.com
.ME \c
.SH SEE ALSO
.PP
\f[B]fzf(1)\f[R]

@@ -23,6 +23,12 @@ OPTIONS
--init
: Print a config file to stdout.

--conf
: Display the config file path.

-a, --all
: Search among all cheatpaths.

-c, --colorize
: Colorize output.

@@ -93,6 +99,9 @@ To search (by regex) for cheatsheets that contain an IP address:
To remove (delete) the foo/bar cheatsheet:
: cheat --rm _foo/bar_

To view the configuration file path:
: cheat --conf


FILES
=====
```
4  go.mod
```
@@ -3,7 +3,7 @@ module github.com/cheat/cheat
go 1.19

require (
	github.com/alecthomas/chroma/v2 v2.14.0
	github.com/alecthomas/chroma/v2 v2.12.0
	github.com/davecgh/go-spew v1.1.1
	github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815
	github.com/go-git/go-git/v5 v5.11.0
@@ -18,7 +18,7 @@ require (
	github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c // indirect
	github.com/cloudflare/circl v1.3.7 // indirect
	github.com/cyphar/filepath-securejoin v0.2.4 // indirect
	github.com/dlclark/regexp2 v1.11.0 // indirect
	github.com/dlclark/regexp2 v1.10.0 // indirect
	github.com/emirpasic/gods v1.18.1 // indirect
	github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect
	github.com/go-git/go-billy/v5 v5.5.0 // indirect
```
12  go.sum
```
@@ -5,10 +5,10 @@ github.com/Microsoft/go-winio v0.6.1 h1:9/kr64B9VUZrLm5YYwbGtUJnMgqWVOdUAXu6Migc
github.com/Microsoft/go-winio v0.6.1/go.mod h1:LRdKpFKfdobln8UmuiYcKPot9D2v6svN5+sAH+4kjUM=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c h1:kMFnB0vCcX7IL/m9Y5LO+KQYv+t1CQOiFe6+SV2J7bE=
github.com/ProtonMail/go-crypto v0.0.0-20230923063757-afb1ddc0824c/go.mod h1:EjAoLdwvbIOoOQr3ihjnSoLZRtE8azugULFRteWMNc0=
github.com/alecthomas/assert/v2 v2.7.0 h1:QtqSACNS3tF7oasA8CU6A6sXZSBDqnm7RfpLl9bZqbE=
github.com/alecthomas/chroma/v2 v2.14.0 h1:R3+wzpnUArGcQz7fCETQBzO5n9IMNi13iIs46aU4V9E=
github.com/alecthomas/chroma/v2 v2.14.0/go.mod h1:QolEbTfmUHIMVpBqxeDnNBj2uoeI4EbYP4i6n68SG4I=
github.com/alecthomas/repr v0.4.0 h1:GhI2A8MACjfegCPVq9f1FLvIBS+DrQ2KQBFZP1iFzXc=
github.com/alecthomas/assert/v2 v2.2.1 h1:XivOgYcduV98QCahG8T5XTezV5bylXe+lBxLG2K2ink=
github.com/alecthomas/chroma/v2 v2.12.0 h1:Wh8qLEgMMsN7mgyG8/qIpegky2Hvzr4By6gEF7cmWgw=
github.com/alecthomas/chroma/v2 v2.12.0/go.mod h1:4TQu7gdfuPjSh76j78ietmqh9LiurGF0EpseFXdKMBw=
github.com/alecthomas/repr v0.2.0 h1:HAzS41CIzNW5syS8Mf9UwXhNH1J9aix/BvDRf1Ml2Yk=
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
github.com/bwesterb/go-ristretto v1.2.3/go.mod h1:fUIoIZaG73pV5biE2Blr2xEzDoMj7NFEuV9ekS419A0=
@@ -20,8 +20,8 @@ github.com/cyphar/filepath-securejoin v0.2.4/go.mod h1:aPGpWjXOXUn2NCNjFvBE6aRxG
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.11.0 h1:G/nrcoOa7ZXlpoa/91N3X7mM3r8eIlMBBJZvsz/mxKI=
github.com/dlclark/regexp2 v1.11.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/dlclark/regexp2 v1.10.0 h1:+/GIL799phkJqYW+3YbOd8LCcbHzT0Pbo8zl70MHsq0=
github.com/dlclark/regexp2 v1.10.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815 h1:bWDMxwH3px2JBh6AyO7hdCn/PkvCZXii8TGj7sbtEbQ=
github.com/docopt/docopt-go v0.0.0-20180111231733-ee0de3bc6815/go.mod h1:WwZ+bS3ebgob9U8Nd0kOddGdZWjyMGR8Wziv+TBNwSE=
github.com/elazarl/goproxy v0.0.0-20230808193330-2592e75ae04a h1:mATvB/9r/3gvcejNsXKSkQ6lcIaNec2nyfOdlTBR2lU=
```
```
@@ -2,6 +2,8 @@
// management.
package cheatpath

import "fmt"

// Cheatpath encapsulates cheatsheet path information
type Cheatpath struct {
	Name string `yaml:"name"`
@@ -9,3 +11,18 @@ type Cheatpath struct {
	ReadOnly bool `yaml:"readonly"`
	Tags []string `yaml:"tags"`
}

// Validate ensures that the Cheatpath is valid
func (c Cheatpath) Validate() error {
	// Check that name is not empty
	if c.Name == "" {
		return fmt.Errorf("cheatpath name cannot be empty")
	}

	// Check that path is not empty
	if c.Path == "" {
		return fmt.Errorf("cheatpath path cannot be empty")
	}

	return nil
}
```
113  internal/cheatpath/cheatpath_test.go  Normal file
@@ -0,0 +1,113 @@
```go
package cheatpath

import (
	"strings"
	"testing"
)

func TestCheatpathValidate(t *testing.T) {
	tests := []struct {
		name      string
		cheatpath Cheatpath
		wantErr   bool
		errMsg    string
	}{
		{
			name: "valid cheatpath",
			cheatpath: Cheatpath{
				Name:     "personal",
				Path:     "/home/user/.config/cheat/personal",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: false,
		},
		{
			name: "empty name",
			cheatpath: Cheatpath{
				Name:     "",
				Path:     "/home/user/.config/cheat/personal",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: true,
			errMsg:  "cheatpath name cannot be empty",
		},
		{
			name: "empty path",
			cheatpath: Cheatpath{
				Name:     "personal",
				Path:     "",
				ReadOnly: false,
				Tags:     []string{"personal"},
			},
			wantErr: true,
			errMsg:  "cheatpath path cannot be empty",
		},
		{
			name: "both empty",
			cheatpath: Cheatpath{
				Name:     "",
				Path:     "",
				ReadOnly: true,
				Tags:     nil,
			},
			wantErr: true,
			errMsg:  "cheatpath name cannot be empty",
		},
		{
			name: "minimal valid",
			cheatpath: Cheatpath{
				Name: "x",
				Path: "/",
			},
			wantErr: false,
		},
		{
			name: "with readonly and tags",
			cheatpath: Cheatpath{
				Name:     "community",
				Path:     "/usr/share/cheat",
				ReadOnly: true,
				Tags:     []string{"community", "shared", "readonly"},
			},
			wantErr: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			err := tt.cheatpath.Validate()
			if (err != nil) != tt.wantErr {
				t.Errorf("Validate() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if err != nil && tt.errMsg != "" && !strings.Contains(err.Error(), tt.errMsg) {
				t.Errorf("Validate() error = %v, want error containing %q", err, tt.errMsg)
			}
		})
	}
}

func TestCheatpathStruct(t *testing.T) {
	// Test that the struct fields work as expected
	cp := Cheatpath{
		Name:     "test",
		Path:     "/test/path",
		ReadOnly: true,
		Tags:     []string{"tag1", "tag2"},
	}

	if cp.Name != "test" {
		t.Errorf("expected Name to be 'test', got %q", cp.Name)
	}
	if cp.Path != "/test/path" {
		t.Errorf("expected Path to be '/test/path', got %q", cp.Path)
	}
	if !cp.ReadOnly {
		t.Error("expected ReadOnly to be true")
	}
	if len(cp.Tags) != 2 || cp.Tags[0] != "tag1" || cp.Tags[1] != "tag2" {
		t.Errorf("expected Tags to be [tag1 tag2], got %v", cp.Tags)
	}
}
```
63  internal/cheatpath/doc.go  Normal file
@@ -0,0 +1,63 @@
```go
// Package cheatpath manages collections of cheat sheets organized in filesystem directories.
//
// A Cheatpath represents a directory containing cheat sheets, with associated
// metadata such as tags and read-only status. Multiple cheatpaths can be
// configured to organize sheets from different sources (personal, community, work, etc.).
//
// # Cheatpath Structure
//
// Each cheatpath has:
//   - Name: A friendly identifier (e.g., "personal", "community")
//   - Path: The filesystem path to the directory
//   - Tags: Tags automatically applied to all sheets in this path
//   - ReadOnly: Whether sheets in this path can be modified
//
// Example configuration:
//
//	cheatpaths:
//	  - name: personal
//	    path: ~/cheat
//	    tags: []
//	    readonly: false
//	  - name: community
//	    path: ~/cheat/community
//	    tags: [community]
//	    readonly: true
//
// # Directory-Scoped Cheatpaths
//
// The package supports directory-scoped cheatpaths via `.cheat` directories.
// When running cheat from a directory containing a `.cheat` subdirectory,
// that directory is temporarily added to the available cheatpaths.
//
// # Precedence and Overrides
//
// When multiple cheatpaths contain a sheet with the same name, the sheet
// from the most "local" cheatpath takes precedence. This allows users to
// override community sheets with personal versions.
//
// # Key Functions
//
//   - Filter: Filters cheatpaths by name
//   - Validate: Ensures cheatpath configuration is valid
//   - Writeable: Returns the first writeable cheatpath
//
// # Example Usage
//
//	// Filter cheatpaths to only "personal"
//	filtered, err := cheatpath.Filter(paths, "personal")
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Find a writeable cheatpath
//	writeable, err := cheatpath.Writeable(paths)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// Validate cheatpath configuration
//	if err := cheatpath.Validate(paths); err != nil {
//		log.Fatal(err)
//	}
package cheatpath
```
@@ -2,16 +2,38 @@ package cheatpath
 
 import (
 	"fmt"
+	"path/filepath"
+	"strings"
 )
 
-// Validate returns an error if the cheatpath is invalid
-func (c *Cheatpath) Validate() error {
-
-	if c.Name == "" {
-		return fmt.Errorf("invalid cheatpath: name must be specified")
-	}
-	if c.Path == "" {
-		return fmt.Errorf("invalid cheatpath: path must be specified")
+// ValidateSheetName ensures that a cheatsheet name does not contain
+// directory traversal sequences or other potentially dangerous patterns.
+func ValidateSheetName(name string) error {
+	// Reject empty names
+	if name == "" {
+		return fmt.Errorf("cheatsheet name cannot be empty")
+	}
+
+	// Reject names containing directory traversal
+	if strings.Contains(name, "..") {
+		return fmt.Errorf("cheatsheet name cannot contain '..'")
+	}
+
+	// Reject absolute paths
+	if filepath.IsAbs(name) {
+		return fmt.Errorf("cheatsheet name cannot be an absolute path")
+	}
+
+	// Reject names that start with ~ (home directory expansion)
+	if strings.HasPrefix(name, "~") {
+		return fmt.Errorf("cheatsheet name cannot start with '~'")
+	}
+
+	// Reject hidden files (files that start with a dot)
+	// We don't display hidden files, so we shouldn't create them
+	filename := filepath.Base(name)
+	if strings.HasPrefix(filename, ".") {
+		return fmt.Errorf("cheatsheet name cannot start with '.' (hidden files are not supported)")
 	}
 
 	return nil
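Taken on its own, the new validation logic can be exercised outside the test suite. A minimal, self-contained sketch follows; the function body mirrors the checks in the diff above, but the lowercase name and the sample inputs are illustrative, not part of the commit:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// validateSheetName mirrors the validation rules from the diff above:
// reject empty names, "..", absolute paths, "~" prefixes, and hidden files.
func validateSheetName(name string) error {
	if name == "" {
		return fmt.Errorf("cheatsheet name cannot be empty")
	}
	if strings.Contains(name, "..") {
		return fmt.Errorf("cheatsheet name cannot contain '..'")
	}
	if filepath.IsAbs(name) {
		return fmt.Errorf("cheatsheet name cannot be an absolute path")
	}
	if strings.HasPrefix(name, "~") {
		return fmt.Errorf("cheatsheet name cannot start with '~'")
	}
	if strings.HasPrefix(filepath.Base(name), ".") {
		return fmt.Errorf("cheatsheet name cannot start with '.'")
	}
	return nil
}

func main() {
	// Only the first name should pass; the rest trip one rule each.
	for _, name := range []string{"docker/compose", "../etc/passwd", "/etc/passwd", "~/secrets", ".hidden"} {
		fmt.Printf("%-16q valid=%v\n", name, validateSheetName(name) == nil)
	}
}
```

Note that the checks compose: "foo/../../etc/passwd" is rejected by the `".."` rule before the path is ever joined to a cheatpath.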
169	internal/cheatpath/validate_fuzz_test.go	Normal file
@@ -0,0 +1,169 @@
package cheatpath

import (
	"strings"
	"testing"
	"unicode/utf8"
)

// FuzzValidateSheetName tests the ValidateSheetName function with fuzzing
// to ensure it properly prevents path traversal and other security issues
func FuzzValidateSheetName(f *testing.F) {
	// Add seed corpus with various valid and malicious inputs
	// Valid names
	f.Add("docker")
	f.Add("docker/compose")
	f.Add("lang/go/slice")
	f.Add("my-cheat_sheet")
	f.Add("file.txt")
	f.Add("a")
	f.Add("123")

	// Path traversal attempts
	f.Add("..")
	f.Add("../etc/passwd")
	f.Add("foo/../bar")
	f.Add("foo/../../etc/passwd")
	f.Add("..\\windows\\system32")
	f.Add("foo\\..\\..\\windows")

	// Encoded traversal attempts
	f.Add("%2e%2e")
	f.Add("%2e%2e%2f")
	f.Add("..%2f")
	f.Add("%2e.")
	f.Add(".%2e")
	f.Add("\x2e\x2e")
	f.Add("\\x2e\\x2e")

	// Unicode and special characters
	f.Add("€test")
	f.Add("test€")
	f.Add("中文")
	f.Add("🎉emoji")
	f.Add("\x00null")
	f.Add("test\x00null")
	f.Add("\nnewline")
	f.Add("test\ttab")

	// Absolute paths
	f.Add("/etc/passwd")
	f.Add("C:\\Windows\\System32")
	f.Add("\\\\server\\share")
	f.Add("//server/share")

	// Home directory
	f.Add("~")
	f.Add("~/config")
	f.Add("~user/file")

	// Hidden files
	f.Add(".hidden")
	f.Add("dir/.hidden")
	f.Add(".git/config")

	// Edge cases
	f.Add("")
	f.Add(" ")
	f.Add("  ")
	f.Add("\t")
	f.Add(".")
	f.Add("./")
	f.Add("./file")
	f.Add(".../")
	f.Add("...")
	f.Add("....")

	// Very long names
	f.Add(strings.Repeat("a", 255))
	f.Add(strings.Repeat("a/", 100) + "file")
	f.Add(strings.Repeat("../", 50) + "etc/passwd")

	f.Fuzz(func(t *testing.T, input string) {
		// The function should never panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("ValidateSheetName panicked with input %q: %v", input, r)
				}
			}()

			err := ValidateSheetName(input)

			// Security invariants that must always hold
			if err == nil {
				// If validation passed, verify security properties

				// Should not contain ".." for path traversal
				if strings.Contains(input, "..") {
					t.Errorf("validation passed but input contains '..': %q", input)
				}

				// Should not be empty
				if input == "" {
					t.Error("validation passed for empty input")
				}

				// Should not start with ~ (home directory)
				if strings.HasPrefix(input, "~") {
					t.Errorf("validation passed but input starts with '~': %q", input)
				}

				// Base filename should not start with .
				parts := strings.Split(input, "/")
				if len(parts) > 0 {
					lastPart := parts[len(parts)-1]
					if strings.HasPrefix(lastPart, ".") && lastPart != "." {
						t.Errorf("validation passed but filename starts with '.': %q", input)
					}
				}

				// Additional check: result should be valid UTF-8
				if !utf8.ValidString(input) {
					// While the function doesn't explicitly check this,
					// we want to ensure it handles invalid UTF-8 gracefully
					t.Logf("validation passed for invalid UTF-8: %q", input)
				}
			}
		}()
	})
}

// FuzzValidateSheetNamePathTraversal specifically targets path traversal bypasses
func FuzzValidateSheetNamePathTraversal(f *testing.F) {
	// Seed corpus focusing on path traversal variations
	f.Add("..", "/", "")
	f.Add("", "..", "/")
	f.Add("a", "b", "c")

	f.Fuzz(func(t *testing.T, prefix string, middle string, suffix string) {
		// Construct various path traversal attempts
		inputs := []string{
			prefix + ".." + suffix,
			prefix + "/.." + suffix,
			prefix + "\\.." + suffix,
			prefix + middle + ".." + suffix,
			prefix + "../" + middle + suffix,
			prefix + "..%2f" + suffix,
			prefix + "%2e%2e" + suffix,
			prefix + "%2e%2e%2f" + suffix,
		}

		for _, input := range inputs {
			func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("ValidateSheetName panicked with constructed input %q: %v", input, r)
					}
				}()

				err := ValidateSheetName(input)

				// If the input contains literal "..", it must be rejected
				if strings.Contains(input, "..") && err == nil {
					t.Errorf("validation incorrectly passed for input containing '..': %q", input)
				}
			}()
		}
	})
}
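Both fuzz targets wrap every call in an inline defer/recover guard so that a panic is reported as a test failure rather than aborting the fuzzing run. The same pattern can be factored into a standalone helper; `callSafely` is an illustrative name, not part of the diff:

```go
package main

import "fmt"

// callSafely invokes fn and converts any panic into an error value --
// the same guard the fuzz targets build inline with defer/recover
// around each ValidateSheetName call.
func callSafely(fn func()) (err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("panicked: %v", r)
		}
	}()
	fn()
	return nil
}

func main() {
	fmt.Println(callSafely(func() {}))             // a quiet function yields a nil error
	fmt.Println(callSafely(func() { panic("x") })) // a panicking one yields a non-nil error
}
```

The named return value is what lets the deferred closure overwrite `err` after `fn` panics.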
@@ -1,56 +1,106 @@
 package cheatpath
 
 import (
+	"strings"
 	"testing"
 )
 
-// TestValidateValid asserts that valid cheatpaths validate successfully
-func TestValidateValid(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Name:     "foo",
-		Path:     "/foo",
-		ReadOnly: false,
-		Tags:     []string{},
-	}
-
-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err != nil {
-		t.Errorf("failed to validate valid cheatpath: %v", err)
-	}
-}
-
-// TestValidateMissingName asserts that paths that are missing a name fail to
-// validate
-func TestValidateMissingName(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Path:     "/foo",
-		ReadOnly: false,
-		Tags:     []string{},
-	}
-
-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err == nil {
-		t.Errorf("failed to invalidate cheatpath without name")
-	}
-}
-
-// TestValidateMissingPath asserts that paths that are missing a path fail to
-// validate
-func TestValidateMissingPath(t *testing.T) {
-
-	// initialize a valid cheatpath
-	cheatpath := Cheatpath{
-		Name:     "foo",
-		ReadOnly: false,
-		Tags:     []string{},
-	}
-
-	// assert that no errors are returned
-	if err := cheatpath.Validate(); err == nil {
-		t.Errorf("failed to invalidate cheatpath without path")
-	}
-}
+func TestValidateSheetName(t *testing.T) {
+	tests := []struct {
+		name    string
+		input   string
+		wantErr bool
+		errMsg  string
+	}{
+		// Valid names
+		{
+			name:    "simple name",
+			input:   "docker",
+			wantErr: false,
+		},
+		{
+			name:    "name with slash",
+			input:   "docker/compose",
+			wantErr: false,
+		},
+		{
+			name:    "name with multiple slashes",
+			input:   "lang/go/slice",
+			wantErr: false,
+		},
+		{
+			name:    "name with dash and underscore",
+			input:   "my-cheat_sheet",
+			wantErr: false,
+		},
+		// Invalid names
+		{
+			name:    "empty name",
+			input:   "",
+			wantErr: true,
+			errMsg:  "empty",
+		},
+		{
+			name:    "parent directory traversal",
+			input:   "../etc/passwd",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "complex traversal",
+			input:   "foo/../../etc/passwd",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "absolute path",
+			input:   "/etc/passwd",
+			wantErr: true,
+			errMsg:  "absolute",
+		},
+		{
+			name:    "home directory",
+			input:   "~/secrets",
+			wantErr: true,
+			errMsg:  "'~'",
+		},
+		{
+			name:    "just dots",
+			input:   "..",
+			wantErr: true,
+			errMsg:  "'..'",
+		},
+		{
+			name:    "hidden file not allowed",
+			input:   ".hidden",
+			wantErr: true,
+			errMsg:  "cannot start with '.'",
+		},
+		{
+			name:    "current dir is ok",
+			input:   "./current",
+			wantErr: false,
+		},
+		{
+			name:    "nested hidden file not allowed",
+			input:   "config/.gitignore",
+			wantErr: true,
+			errMsg:  "cannot start with '.'",
+		},
+	}
+
+	for _, tt := range tests {
+		t.Run(tt.name, func(t *testing.T) {
+			err := ValidateSheetName(tt.input)
+			if (err != nil) != tt.wantErr {
+				t.Errorf("ValidateSheetName(%q) error = %v, wantErr %v", tt.input, err, tt.wantErr)
+				return
+			}
+			if err != nil && tt.errMsg != "" {
+				if !strings.Contains(err.Error(), tt.errMsg) {
+					t.Errorf("ValidateSheetName(%q) error = %v, want error containing %q", tt.input, err, tt.errMsg)
+				}
+			}
+		})
+	}
+}
@@ -96,6 +96,9 @@ func New(_ map[string]interface{}, confPath string, resolve bool) (Config, error
 		conf.Cheatpaths[i].Path = expanded
 	}
 
+	// trim editor whitespace
+	conf.Editor = strings.TrimSpace(conf.Editor)
+
 	// if an editor was not provided in the configs, attempt to choose one
 	// that's appropriate for the environment
 	if conf.Editor == "" {
247	internal/config/config_extended_test.go	Normal file
@@ -0,0 +1,247 @@
package config

import (
	"os"
	"path/filepath"
	"testing"

	"github.com/cheat/cheat/internal/mock"
)

// TestConfigYAMLErrors tests YAML parsing errors
func TestConfigYAMLErrors(t *testing.T) {
	// Create a temporary file with invalid YAML
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	invalidYAML := filepath.Join(tempDir, "invalid.yml")
	err = os.WriteFile(invalidYAML, []byte("invalid: yaml: content:\n - no closing"), 0644)
	if err != nil {
		t.Fatalf("failed to write invalid yaml: %v", err)
	}

	// Attempt to load invalid YAML
	_, err = New(map[string]interface{}{}, invalidYAML, false)
	if err == nil {
		t.Error("expected error for invalid YAML, got nil")
	}
}

// TestConfigLocalCheatpath tests local .cheat directory detection
func TestConfigLocalCheatpath(t *testing.T) {
	// Create a temporary directory to act as working directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Save current working directory
	oldCwd, err := os.Getwd()
	if err != nil {
		t.Fatalf("failed to get cwd: %v", err)
	}
	defer os.Chdir(oldCwd)

	// Change to temp directory
	err = os.Chdir(tempDir)
	if err != nil {
		t.Fatalf("failed to change dir: %v", err)
	}

	// Create .cheat directory
	localCheat := filepath.Join(tempDir, ".cheat")
	err = os.Mkdir(localCheat, 0755)
	if err != nil {
		t.Fatalf("failed to create .cheat dir: %v", err)
	}

	// Load config
	conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Check that local cheatpath was added
	found := false
	for _, cp := range conf.Cheatpaths {
		if cp.Name == "cwd" && cp.Path == localCheat {
			found = true
			break
		}
	}

	if !found {
		t.Error("local .cheat directory was not added to cheatpaths")
	}
}

// TestConfigDefaults tests default values
func TestConfigDefaults(t *testing.T) {
	// Load empty config
	conf, err := New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Check defaults
	if conf.Style != "bw" {
		t.Errorf("expected default style 'bw', got %s", conf.Style)
	}

	if conf.Formatter != "terminal" {
		t.Errorf("expected default formatter 'terminal', got %s", conf.Formatter)
	}
}

// TestConfigSymlinkResolution tests symlink resolution
func TestConfigSymlinkResolution(t *testing.T) {
	// Create temp directory structure
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create target directory
	targetDir := filepath.Join(tempDir, "target")
	err = os.Mkdir(targetDir, 0755)
	if err != nil {
		t.Fatalf("failed to create target dir: %v", err)
	}

	// Create symlink
	linkPath := filepath.Join(tempDir, "link")
	err = os.Symlink(targetDir, linkPath)
	if err != nil {
		t.Fatalf("failed to create symlink: %v", err)
	}

	// Create config with symlink path
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ` + linkPath + `
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config with symlink resolution
	conf, err := New(map[string]interface{}{}, configFile, true)
	if err != nil {
		t.Errorf("failed to load config: %v", err)
	}

	// Verify symlink was resolved
	if len(conf.Cheatpaths) > 0 && conf.Cheatpaths[0].Path != targetDir {
		t.Errorf("expected symlink to be resolved to %s, got %s", targetDir, conf.Cheatpaths[0].Path)
	}
}

// TestConfigBrokenSymlink tests broken symlink handling
func TestConfigBrokenSymlink(t *testing.T) {
	// Create temp directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create broken symlink
	linkPath := filepath.Join(tempDir, "broken-link")
	err = os.Symlink("/nonexistent/path", linkPath)
	if err != nil {
		t.Fatalf("failed to create symlink: %v", err)
	}

	// Create config with broken symlink
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ` + linkPath + `
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config with symlink resolution should fail
	_, err = New(map[string]interface{}{}, configFile, true)
	if err == nil {
		t.Error("expected error for broken symlink, got nil")
	}
}

// TestConfigTildeExpansionError tests tilde expansion error handling
func TestConfigTildeExpansionError(t *testing.T) {
	// This is tricky to test without mocking homedir.Expand
	// We'll create a config with an invalid home reference
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create config with user that likely doesn't exist
	configContent := `---
editor: vim
cheatpaths:
  - name: test
    path: ~nonexistentuser12345/cheat
    readonly: true
`
	configFile := filepath.Join(tempDir, "config.yml")
	err = os.WriteFile(configFile, []byte(configContent), 0644)
	if err != nil {
		t.Fatalf("failed to write config: %v", err)
	}

	// Load config - this may or may not fail depending on the system
	// but we're testing that it doesn't panic
	_, _ = New(map[string]interface{}{}, configFile, false)
}

// TestConfigGetCwdError tests error handling when os.Getwd fails
func TestConfigGetCwdError(t *testing.T) {
	// This is difficult to test without being able to break os.Getwd
	// We'll create a scenario where the current directory is removed

	// Create and enter a temp directory
	tempDir, err := os.MkdirTemp("", "cheat-config-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}

	oldCwd, err := os.Getwd()
	if err != nil {
		t.Fatalf("failed to get cwd: %v", err)
	}
	defer os.Chdir(oldCwd)

	err = os.Chdir(tempDir)
	if err != nil {
		t.Fatalf("failed to change dir: %v", err)
	}

	// Remove the directory we're in
	err = os.RemoveAll(tempDir)
	if err != nil {
		t.Fatalf("failed to remove temp dir: %v", err)
	}

	// Now os.Getwd should fail
	_, err = New(map[string]interface{}{}, mock.Path("conf/empty.yml"), false)
	// This might not fail on all systems, so we just ensure no panic
	_ = err
}
52	internal/config/doc.go	Normal file
@@ -0,0 +1,52 @@
// Package config manages application configuration and settings.
//
// The config package provides functionality to:
//   - Load configuration from YAML files
//   - Validate configuration values
//   - Manage platform-specific configuration paths
//   - Handle editor and pager settings
//   - Configure colorization and formatting options
//
// # Configuration Structure
//
// The main configuration file (conf.yml) contains:
//   - Editor preferences
//   - Pager settings
//   - Colorization options
//   - Cheatpath definitions
//   - Formatting preferences
//
// Example configuration:
//
//	---
//	editor: vim
//	colorize: true
//	style: monokai
//	formatter: terminal256
//	pager: less -FRX
//	cheatpaths:
//	  - name: personal
//	    path: ~/cheat
//	    tags: []
//	    readonly: false
//	  - name: community
//	    path: ~/cheat/.cheat
//	    tags: [community]
//	    readonly: true
//
// # Platform-Specific Paths
//
// The package automatically detects configuration paths based on the operating system:
//   - Linux/Unix: $XDG_CONFIG_HOME/cheat/conf.yml or ~/.config/cheat/conf.yml
//   - macOS: ~/Library/Application Support/cheat/conf.yml
//   - Windows: %APPDATA%\cheat\conf.yml
//
// # Environment Variables
//
// The following environment variables are respected:
//   - CHEAT_CONFIG_PATH: Override the configuration file location
//   - CHEAT_USE_FZF: Enable fzf integration when set to "true"
//   - VISUAL: Default editor if not specified in config (takes precedence over EDITOR)
//   - EDITOR: Fallback editor if VISUAL is not set
//   - PAGER: Default pager if not specified in config
package config
95	internal/config/editor_test.go	Normal file
@@ -0,0 +1,95 @@
package config

import (
	"os"
	"runtime"
	"testing"
)

// TestEditor tests the Editor function
func TestEditor(t *testing.T) {
	// Save original env vars
	oldVisual := os.Getenv("VISUAL")
	oldEditor := os.Getenv("EDITOR")
	defer func() {
		os.Setenv("VISUAL", oldVisual)
		os.Setenv("EDITOR", oldEditor)
	}()

	t.Run("windows default", func(t *testing.T) {
		if runtime.GOOS != "windows" {
			t.Skip("skipping windows test on non-windows platform")
		}

		// Clear env vars
		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "notepad" {
			t.Errorf("expected 'notepad' on windows, got %s", editor)
		}
	})

	t.Run("VISUAL takes precedence", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("VISUAL", "emacs")
		os.Setenv("EDITOR", "nano")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "emacs" {
			t.Errorf("expected VISUAL to take precedence, got %s", editor)
		}
	})

	t.Run("EDITOR when no VISUAL", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "vim")

		editor, err := Editor()
		if err != nil {
			t.Errorf("unexpected error: %v", err)
		}
		if editor != "vim" {
			t.Errorf("expected EDITOR value, got %s", editor)
		}
	})

	t.Run("no editor found error", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		// Clear all environment variables
		os.Setenv("VISUAL", "")
		os.Setenv("EDITOR", "")

		// Create a custom PATH that doesn't include common editors
		oldPath := os.Getenv("PATH")
		defer os.Setenv("PATH", oldPath)

		// Set a very limited PATH that won't have editors
		os.Setenv("PATH", "/nonexistent")

		editor, err := Editor()

		// If we found an editor, it's likely in the system
		// This test might not always produce an error on systems with editors
		if editor == "" && err == nil {
			t.Error("expected error when no editor found")
		}
	})
}
@@ -2,6 +2,8 @@ package config
 
 import (
 	"os"
+	"path/filepath"
+	"strings"
 	"testing"
 )
 
@@ -35,3 +37,84 @@ func TestInit(t *testing.T) {
 		t.Errorf("failed to write configs: want: %s, got: %s", conf, got)
 	}
 }
+
+// TestInitCreateDirectory tests that Init creates the directory if it doesn't exist
+func TestInitCreateDirectory(t *testing.T) {
+	// Create a temp directory
+	tempDir, err := os.MkdirTemp("", "cheat-init-test-*")
+	if err != nil {
+		t.Fatalf("failed to create temp dir: %v", err)
+	}
+	defer os.RemoveAll(tempDir)
+
+	// Path to a config file in a non-existent subdirectory
+	confPath := filepath.Join(tempDir, "subdir", "conf.yml")
+
+	// Initialize the config file
+	conf := "test config"
+	if err = Init(confPath, conf); err != nil {
+		t.Errorf("failed to init config file: %v", err)
+	}
+
+	// Verify the directory was created
+	if _, err := os.Stat(filepath.Dir(confPath)); os.IsNotExist(err) {
+		t.Error("Init did not create the directory")
+	}
+
+	// Verify the file was created with correct content
+	bytes, err := os.ReadFile(confPath)
+	if err != nil {
+		t.Errorf("failed to read config file: %v", err)
+	}
+	if string(bytes) != conf {
+		t.Errorf("config content mismatch: got %q, want %q", string(bytes), conf)
+	}
+}
+
+// TestInitWriteError tests error handling when file write fails
+func TestInitWriteError(t *testing.T) {
+	// Skip this test if running as root (can write anywhere)
+	if os.Getuid() == 0 {
+		t.Skip("Cannot test write errors as root")
+	}
+
+	// Try to write to a read-only directory
+	err := Init("/dev/null/impossible/path/conf.yml", "test")
+	if err == nil {
+		t.Error("expected error when writing to invalid path, got nil")
+	}
+	if err != nil && !strings.Contains(err.Error(), "failed to create") {
+		t.Errorf("expected 'failed to create' error, got: %v", err)
+	}
+}
+
+// TestInitExistingFile tests that Init overwrites existing files
+func TestInitExistingFile(t *testing.T) {
+	// Create a temp file
+	tempFile, err := os.CreateTemp("", "cheat-init-existing-*")
+	if err != nil {
+		t.Fatalf("failed to create temp file: %v", err)
+	}
+	defer os.Remove(tempFile.Name())
+
+	// Write initial content
+	initialContent := "initial content"
+	if err := os.WriteFile(tempFile.Name(), []byte(initialContent), 0644); err != nil {
+		t.Fatalf("failed to write initial content: %v", err)
+	}
+
+	// Initialize with new content
+	newContent := "new config content"
+	if err = Init(tempFile.Name(), newContent); err != nil {
+		t.Errorf("failed to init over existing file: %v", err)
+	}
+
+	// Verify the file was overwritten
+	bytes, err := os.ReadFile(tempFile.Name())
+	if err != nil {
+		t.Errorf("failed to read config file: %v", err)
+	}
+	if string(bytes) != newContent {
+		t.Errorf("config not overwritten: got %q, want %q", string(bytes), newContent)
+	}
+}
125	internal/config/new_test.go	Normal file
@@ -0,0 +1,125 @@
package config

import (
	"os"
	"path/filepath"
	"testing"
)

func TestNewTrimsWhitespace(t *testing.T) {
	// Create a temporary config file with whitespace in editor and pager
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: " vim -c 'set number' "
pager: " less -R "
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		t.Fatalf("failed to load config: %v", err)
	}

	// Verify editor is trimmed
	expectedEditor := "vim -c 'set number'"
	if conf.Editor != expectedEditor {
		t.Errorf("editor not properly trimmed: got %q, want %q", conf.Editor, expectedEditor)
	}

	// Verify pager is trimmed
	expectedPager := "less -R"
	if conf.Pager != expectedPager {
		t.Errorf("pager not properly trimmed: got %q, want %q", conf.Pager, expectedPager)
	}
}

func TestNewEmptyEditorFallback(t *testing.T) {
	// Skip if required environment variables would interfere
	oldVisual := os.Getenv("VISUAL")
	oldEditor := os.Getenv("EDITOR")
	os.Unsetenv("VISUAL")
	os.Unsetenv("EDITOR")
	defer func() {
		os.Setenv("VISUAL", oldVisual)
		os.Setenv("EDITOR", oldEditor)
	}()

	// Create a config with whitespace-only editor
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: " "
pager: less
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		// It's OK if this fails due to no editor being found
		// The important thing is it doesn't panic
		return
	}

	// If it succeeded, editor should not be empty (fallback was used)
	if conf.Editor == "" {
		t.Error("editor should not be empty after fallback")
	}
}

func TestNewWhitespaceOnlyPager(t *testing.T) {
	// Create a config with whitespace-only pager
	tmpDir := t.TempDir()
	configPath := filepath.Join(tmpDir, "config.yml")

	configContent := `---
editor: vim
pager: " "
style: monokai
formatter: terminal
cheatpaths:
  - name: personal
    path: ~/cheat
    tags: []
    readonly: false
`

	if err := os.WriteFile(configPath, []byte(configContent), 0644); err != nil {
		t.Fatalf("failed to write test config: %v", err)
	}

	// Load the config
	conf, err := New(map[string]interface{}{}, configPath, false)
	if err != nil {
		t.Fatalf("failed to load config: %v", err)
	}

	// Pager should be empty after trimming
	if conf.Pager != "" {
		t.Errorf("pager should be empty after trimming whitespace: got %q", conf.Pager)
	}
}
@@ -22,7 +22,7 @@ func Pager() string {
 	// Otherwise, search for `pager`, `less`, and `more` on the `$PATH`. If
 	// none are found, return an empty pager.
 	for _, pager := range []string{"pager", "less", "more"} {
-		if path, err := exec.LookPath(pager); err != nil {
+		if path, err := exec.LookPath(pager); err == nil {
 			return path
 		}
 	}
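The one-character fix above flips the LookPath check from err != nil to err == nil, so the loop returns the first pager that is found rather than the first that is missing. The corrected pattern, as a standalone sketch (firstOnPath is an illustrative name, not the package's API):

```go
package main

import (
	"fmt"
	"os/exec"
)

// firstOnPath returns the resolved path of the first candidate found on
// $PATH, or "" when none resolve. err == nil means LookPath succeeded,
// which is exactly the condition the original code inverted.
func firstOnPath(candidates []string) string {
	for _, c := range candidates {
		if path, err := exec.LookPath(c); err == nil {
			return path
		}
	}
	return ""
}

func main() {
	// A name this improbable should not resolve, so the result is "".
	missing := firstOnPath([]string{"definitely-not-a-real-pager-zz9"})
	fmt.Println(missing == "")
}
```

With the original `err != nil`, the function would have returned the zero-value path of the first candidate that was *not* installed, which is why the bug mattered.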
90	internal/config/pager_test.go	Normal file
@@ -0,0 +1,90 @@
package config

import (
	"os"
	"runtime"
	"testing"
)

// TestPager tests the Pager function
func TestPager(t *testing.T) {
	// Save original env var
	oldPager := os.Getenv("PAGER")
	defer os.Setenv("PAGER", oldPager)

	t.Run("windows default", func(t *testing.T) {
		if runtime.GOOS != "windows" {
			t.Skip("skipping windows test on non-windows platform")
		}

		os.Setenv("PAGER", "")
		pager := Pager()
		if pager != "more" {
			t.Errorf("expected 'more' on windows, got %s", pager)
		}
	})

	t.Run("PAGER env var", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "bat")
		pager := Pager()
		if pager != "bat" {
			t.Errorf("expected PAGER env var value, got %s", pager)
		}
	})

	t.Run("fallback to system pager", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "")
		pager := Pager()

		// Should find one of the fallback pagers or return empty string
		validPagers := map[string]bool{
			"":      true, // no pager found
			"pager": true,
			"less":  true,
			"more":  true,
		}

		// Check if it's a path to one of these
		found := false
		for p := range validPagers {
			if p == "" && pager == "" {
				found = true
				break
			}
			if p != "" && (pager == p || len(pager) >= len(p) && pager[len(pager)-len(p):] == p) {
				found = true
				break
			}
		}

		if !found {
			t.Errorf("unexpected pager value: %s", pager)
		}
	})

	t.Run("no pager available", func(t *testing.T) {
		if runtime.GOOS == "windows" {
			t.Skip("skipping non-windows test on windows platform")
		}

		os.Setenv("PAGER", "")

		// Save and modify PATH to ensure no pagers are found
		oldPath := os.Getenv("PATH")
		defer os.Setenv("PATH", oldPath)
		os.Setenv("PATH", "/nonexistent")

		pager := Pager()
		if pager != "" {
			t.Errorf("expected empty string when no pager found, got %s", pager)
		}
	})
}
45 internal/display/doc.go Normal file
@@ -0,0 +1,45 @@
// Package display handles output formatting and presentation for the cheat application.
//
// The display package provides utilities for:
//   - Writing output to stdout or a pager
//   - Formatting text with indentation
//   - Creating faint (dimmed) text for de-emphasis
//   - Managing colored output
//
// # Pager Integration
//
// The package integrates with system pagers (less, more, etc.) to handle
// long output. If a pager is configured and the output is to a terminal,
// content is automatically piped through the pager.
//
// # Text Formatting
//
// Various formatting utilities are provided:
//   - Faint: Creates dimmed text using ANSI escape codes
//   - Indent: Adds consistent indentation to text blocks
//   - Write: Intelligent output that uses stdout or pager as appropriate
//
// # Example Usage
//
//	// Write output, using pager if configured
//	display.Write(output, config)
//
//	// Create faint text for de-emphasis
//	fainted := display.Faint("(read-only)", config)
//
//	// Indent a block of text
//	indented := display.Indent(text, " ")
//
// # Color Support
//
// The package respects the colorization settings from the config.
// When colorization is disabled, formatting functions like Faint
// return unmodified text.
//
// # Terminal Detection
//
// The package uses isatty to detect if output is to a terminal,
// which affects decisions about using a pager and applying colors.
package display
@@ -19,6 +19,11 @@ func Write(out string, conf config.Config) {
 	}

 	// otherwise, pipe output through the pager
+	writeToPager(out, conf)
+}
+
+// writeToPager writes output through a pager command
+func writeToPager(out string, conf config.Config) {
 	parts := strings.Split(conf.Pager, " ")
 	pager := parts[0]
 	args := parts[1:]
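The `writeToPager` function extracted in this hunk treats the first token of the configured pager string as the command and the remaining tokens as its arguments. A self-contained sketch of that parsing (a naive space split, as in the hunk; quoted arguments are not handled):

```go
package main

import (
	"fmt"
	"strings"
)

// splitPager splits a configured pager string into the command name
// and its arguments, using a naive space split as writeToPager does.
func splitPager(pagerCmd string) (string, []string) {
	parts := strings.Split(pagerCmd, " ")
	return parts[0], parts[1:]
}

func main() {
	cmd, args := splitPager("less -R -F")
	fmt.Println(cmd, args) // less [-R -F]
}
```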
136 internal/display/write_test.go Normal file
@@ -0,0 +1,136 @@
package display

import (
	"bytes"
	"io"
	"os"
	"os/exec"
	"strings"
	"testing"

	"github.com/cheat/cheat/internal/config"
)

// TestWriteToPager tests the writeToPager function
func TestWriteToPager(t *testing.T) {
	// Skip these tests in CI/CD environments where interactive commands might not work
	if os.Getenv("CI") != "" {
		t.Skip("Skipping pager tests in CI environment")
	}

	// Note: We can't easily test os.Exit calls, so we focus on testing
	// writeToPager, which contains the core logic.

	t.Run("successful pager execution", func(t *testing.T) {
		// Save original stdout
		oldStdout := os.Stdout
		defer func() {
			os.Stdout = oldStdout
		}()

		// Create pipe for capturing output
		r, w, _ := os.Pipe()
		os.Stdout = w

		// Use 'cat' as a simple pager that just outputs its input
		conf := config.Config{
			Pager: "cat",
		}

		// This will call os.Exit on error, so we need to be careful.
		// We're using 'cat', which should always succeed.
		input := "Test output\n"

		// Run in a goroutine to avoid blocking
		done := make(chan bool)
		go func() {
			writeToPager(input, conf)
			done <- true
		}()

		// Wait for completion
		<-done

		// Close write end and read output
		w.Close()
		var buf bytes.Buffer
		io.Copy(&buf, r)

		// Verify output
		if buf.String() != input {
			t.Errorf("expected output %q, got %q", input, buf.String())
		}
	})

	t.Run("pager with arguments", func(t *testing.T) {
		// Save original stdout
		oldStdout := os.Stdout
		defer func() {
			os.Stdout = oldStdout
		}()

		// Create pipe for capturing output
		r, w, _ := os.Pipe()
		os.Stdout = w

		// Use 'cat' with the '-A' flag (shows non-printing characters)
		conf := config.Config{
			Pager: "cat -A",
		}

		input := "Test\toutput\n"

		// Run in a goroutine
		done := make(chan bool)
		go func() {
			writeToPager(input, conf)
			done <- true
		}()

		// Wait for completion
		<-done

		// Close write end and read output
		w.Close()
		var buf bytes.Buffer
		io.Copy(&buf, r)

		// cat -A shows tabs as ^I and line endings as $
		expected := "Test^Ioutput$\n"
		if buf.String() != expected {
			t.Errorf("expected output %q, got %q", expected, buf.String())
		}
	})
}

// TestWriteToPagerError tests error handling in writeToPager
func TestWriteToPagerError(t *testing.T) {
	if os.Getenv("TEST_PAGER_ERROR_SUBPROCESS") == "1" {
		// This is the subprocess - run the actual test
		conf := config.Config{Pager: "/nonexistent/command"}
		writeToPager("test", conf)
		return
	}

	// Run test in subprocess to handle os.Exit
	cmd := exec.Command(os.Args[0], "-test.run=^TestWriteToPagerError$")
	cmd.Env = append(os.Environ(), "TEST_PAGER_ERROR_SUBPROCESS=1")

	output, err := cmd.CombinedOutput()

	// Should exit with error
	if err == nil {
		t.Error("expected process to exit with error")
	}

	// Should contain error message
	if !strings.Contains(string(output), "failed to write to pager") {
		t.Errorf("expected error message about pager failure, got %q", string(output))
	}
}
180 internal/installer/prompt_test.go Normal file
@@ -0,0 +1,180 @@
package installer

import (
	"bytes"
	"fmt"
	"io"
	"os"
	"strings"
	"testing"
)

func TestPrompt(t *testing.T) {
	// Save original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	tests := []struct {
		name       string
		prompt     string
		input      string
		defaultVal bool
		want       bool
		wantErr    bool
		wantPrompt string
	}{
		{
			name:       "answer yes",
			prompt:     "Continue?",
			input:      "y\n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer yes with uppercase",
			prompt:     "Continue?",
			input:      "Y\n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer yes with spaces",
			prompt:     "Continue?",
			input:      " y \n",
			defaultVal: false,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer no",
			prompt:     "Continue?",
			input:      "n\n",
			defaultVal: true,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "answer no with any text",
			prompt:     "Continue?",
			input:      "anything\n",
			defaultVal: true,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "empty answer uses default true",
			prompt:     "Continue?",
			input:      "\n",
			defaultVal: true,
			want:       true,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "empty answer uses default false",
			prompt:     "Continue?",
			input:      "\n",
			defaultVal: false,
			want:       false,
			wantPrompt: "Continue?: ",
		},
		{
			name:       "whitespace answer uses default",
			prompt:     "Continue?",
			input:      " \n",
			defaultVal: true,
			want:       true,
			wantPrompt: "Continue?: ",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Create a pipe for stdin
			r, w, _ := os.Pipe()
			os.Stdin = r

			// Create a pipe for stdout to capture the prompt
			rOut, wOut, _ := os.Pipe()
			os.Stdout = wOut

			// Write input to stdin
			go func() {
				defer w.Close()
				io.WriteString(w, tt.input)
			}()

			// Call the function
			got, err := Prompt(tt.prompt, tt.defaultVal)

			// Close stdout write end and read the prompt
			wOut.Close()
			var buf bytes.Buffer
			io.Copy(&buf, rOut)

			// Check error
			if (err != nil) != tt.wantErr {
				t.Errorf("Prompt() error = %v, wantErr %v", err, tt.wantErr)
				return
			}

			// Check result
			if got != tt.want {
				t.Errorf("Prompt() = %v, want %v", got, tt.want)
			}

			// Check that prompt was displayed correctly
			if buf.String() != tt.wantPrompt {
				t.Errorf("Prompt display = %q, want %q", buf.String(), tt.wantPrompt)
			}
		})
	}
}

func TestPromptError(t *testing.T) {
	// Save original stdin
	oldStdin := os.Stdin
	defer func() {
		os.Stdin = oldStdin
	}()

	// Create a pipe and close it immediately to simulate a read error
	r, w, _ := os.Pipe()
	os.Stdin = r
	r.Close()
	w.Close()

	// This should cause a read error
	_, err := Prompt("Test?", false)
	if err == nil {
		t.Fatal("expected error when reading from closed stdin, got nil")
	}
	if !strings.Contains(err.Error(), "failed to parse input") {
		t.Errorf("expected 'failed to parse input' error, got: %v", err)
	}
}

// TestPromptIntegration provides a simple integration test
func TestPromptIntegration(t *testing.T) {
	// This demonstrates how the prompt would be used in practice.
	// It's skipped by default since it requires actual user input.
	if os.Getenv("TEST_INTERACTIVE") != "1" {
		t.Skip("Skipping interactive test - set TEST_INTERACTIVE=1 to run")
	}

	fmt.Println("\n=== Interactive Prompt Test ===")
	fmt.Println("You will be prompted to answer a question.")
	fmt.Println("Try different inputs: y, n, Y, N, empty (just press Enter)")

	result, err := Prompt("Would you like to continue? [Y/n]", true)
	if err != nil {
		t.Fatalf("Prompt failed: %v", err)
	}

	fmt.Printf("You answered: %v\n", result)
}
236 internal/installer/run_test.go Normal file
@@ -0,0 +1,236 @@
package installer

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
	"strings"
	"testing"
)

func TestRun(t *testing.T) {
	// Create a temporary directory for testing
	tempDir, err := os.MkdirTemp("", "cheat-installer-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Save original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	tests := []struct {
		name          string
		configs       string
		confpath      string
		userInput     string
		wantErr       bool
		wantInErr     string
		checkFiles    []string
		dontWantFiles []string
	}{
		{
			name: "user declines community cheatsheets",
			configs: `---
editor: EDITOR_PATH
pager: PAGER_PATH
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
    tags: [ community ]
    readonly: true
  - name: personal
    path: PERSONAL_PATH
    tags: [ personal ]
    readonly: false
`,
			confpath:      filepath.Join(tempDir, "conf1", "conf.yml"),
			userInput:     "n\n",
			wantErr:       false,
			checkFiles:    []string{"conf1/conf.yml"},
			dontWantFiles: []string{"conf1/cheatsheets/community", "conf1/cheatsheets/personal"},
		},
		{
			name: "user accepts but clone fails",
			configs: `---
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
`,
			confpath:  filepath.Join(tempDir, "conf2", "conf.yml"),
			userInput: "y\n",
			wantErr:   true,
			wantInErr: "failed to clone cheatsheets",
		},
		{
			name:      "invalid config path",
			configs:   "test",
			confpath:  "/nonexistent/path/conf.yml",
			userInput: "n\n",
			wantErr:   true,
			wantInErr: "failed to create config file",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			// Create stdin pipe
			r, w, _ := os.Pipe()
			os.Stdin = r

			// Create stdout pipe to suppress output
			_, wOut, _ := os.Pipe()
			os.Stdout = wOut

			// Write user input
			go func() {
				defer w.Close()
				io.WriteString(w, tt.userInput)
			}()

			// Run the installer
			err := Run(tt.configs, tt.confpath)

			// Close pipes
			wOut.Close()

			// Check error
			if (err != nil) != tt.wantErr {
				t.Errorf("Run() error = %v, wantErr %v", err, tt.wantErr)
			}
			if err != nil && tt.wantInErr != "" && !strings.Contains(err.Error(), tt.wantInErr) {
				t.Errorf("Run() error = %v, want error containing %q", err, tt.wantInErr)
			}

			// Check created files
			for _, file := range tt.checkFiles {
				path := filepath.Join(tempDir, file)
				if _, err := os.Stat(path); os.IsNotExist(err) {
					t.Errorf("expected file %s to exist, but it doesn't", path)
				}
			}

			// Check files that shouldn't exist
			for _, file := range tt.dontWantFiles {
				path := filepath.Join(tempDir, file)
				if _, err := os.Stat(path); err == nil {
					t.Errorf("expected file %s to not exist, but it does", path)
				}
			}
		})
	}
}

func TestRunPromptError(t *testing.T) {
	// Save original stdin
	oldStdin := os.Stdin
	defer func() {
		os.Stdin = oldStdin
	}()

	// Close stdin to cause prompt error
	r, w, _ := os.Pipe()
	os.Stdin = r
	r.Close()
	w.Close()

	tempDir, _ := os.MkdirTemp("", "cheat-installer-prompt-test-*")
	defer os.RemoveAll(tempDir)

	err := Run("test", filepath.Join(tempDir, "conf.yml"))
	if err == nil {
		t.Fatal("expected error when prompt fails, got nil")
	}
	if !strings.Contains(err.Error(), "failed to prompt") {
		t.Errorf("expected 'failed to prompt' error, got: %v", err)
	}
}

func TestRunStringReplacements(t *testing.T) {
	// Test that path replacements work correctly
	configs := `---
editor: EDITOR_PATH
pager: PAGER_PATH
cheatpaths:
  - name: community
    path: COMMUNITY_PATH
  - name: personal
    path: PERSONAL_PATH
`

	// Create temp directory
	tempDir, err := os.MkdirTemp("", "cheat-installer-replace-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	confpath := filepath.Join(tempDir, "conf.yml")
	confdir := filepath.Dir(confpath)

	// Expected paths
	expectedCommunity := filepath.Join(confdir, "cheatsheets", "community")
	expectedPersonal := filepath.Join(confdir, "cheatsheets", "personal")

	// Save original stdin/stdout
	oldStdin := os.Stdin
	oldStdout := os.Stdout
	defer func() {
		os.Stdin = oldStdin
		os.Stdout = oldStdout
	}()

	// Create stdin pipe with "n" answer
	r, w, _ := os.Pipe()
	os.Stdin = r
	go func() {
		defer w.Close()
		io.WriteString(w, "n\n")
	}()

	// Suppress stdout
	_, wOut, _ := os.Pipe()
	os.Stdout = wOut
	defer wOut.Close()

	// Run installer
	err = Run(configs, confpath)
	if err != nil {
		t.Fatalf("Run() failed: %v", err)
	}

	// Read the created config file
	content, err := os.ReadFile(confpath)
	if err != nil {
		t.Fatalf("failed to read config file: %v", err)
	}

	// Check replacements
	contentStr := string(content)
	if strings.Contains(contentStr, "COMMUNITY_PATH") {
		t.Error("COMMUNITY_PATH was not replaced")
	}
	if strings.Contains(contentStr, "PERSONAL_PATH") {
		t.Error("PERSONAL_PATH was not replaced")
	}
	if strings.Contains(contentStr, "EDITOR_PATH") && !strings.Contains(contentStr, fmt.Sprintf("editor: %s", "")) {
		t.Error("EDITOR_PATH was not replaced")
	}
	if strings.Contains(contentStr, "PAGER_PATH") && !strings.Contains(contentStr, fmt.Sprintf("pager: %s", "")) {
		t.Error("PAGER_PATH was not replaced")
	}

	// Verify correct paths were used
	if !strings.Contains(contentStr, expectedCommunity) {
		t.Errorf("expected community path %q in config", expectedCommunity)
	}
	if !strings.Contains(contentStr, expectedPersonal) {
		t.Errorf("expected personal path %q in config", expectedPersonal)
	}
}
@@ -8,11 +8,11 @@ import (
 	"github.com/go-git/go-git/v5"
 )

-// Clone clones the repo available at `url`
-func Clone(url string) error {
+// Clone clones the community cheatsheets repository to the specified directory
+func Clone(dir string) error {

 	// clone the community cheatsheets
-	_, err := git.PlainClone(url, false, &git.CloneOptions{
+	_, err := git.PlainClone(dir, false, &git.CloneOptions{
 		URL:      "https://github.com/cheat/cheatsheets.git",
 		Depth:    1,
 		Progress: os.Stdout,
80 internal/repo/clone_integration_test.go Normal file
@@ -0,0 +1,80 @@
//go:build integration
// +build integration

package repo

import (
	"os"
	"path/filepath"
	"testing"
)

// TestCloneIntegration performs a real clone operation to verify functionality.
// Run with: go test -tags=integration ./internal/repo -v -run TestCloneIntegration
func TestCloneIntegration(t *testing.T) {
	if testing.Short() {
		t.Skip("Skipping integration test in short mode")
	}

	// Create a temporary directory
	tmpDir, err := os.MkdirTemp("", "cheat-clone-integration-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tmpDir)

	destDir := filepath.Join(tmpDir, "cheatsheets")

	t.Logf("Cloning to: %s", destDir)

	// Perform the actual clone
	err = Clone(destDir)
	if err != nil {
		t.Fatalf("Clone() failed: %v", err)
	}

	// Verify the clone succeeded
	info, err := os.Stat(destDir)
	if err != nil {
		t.Fatalf("destination directory not created: %v", err)
	}

	if !info.IsDir() {
		t.Fatal("destination is not a directory")
	}

	// Check for the .git directory
	gitDir := filepath.Join(destDir, ".git")
	if _, err := os.Stat(gitDir); err != nil {
		t.Error(".git directory not found")
	}

	// Check for some expected cheatsheets
	expectedFiles := []string{
		"bash", // bash cheatsheet should exist
		"git",  // git cheatsheet should exist
		"ls",   // ls cheatsheet should exist
	}

	foundCount := 0
	for _, file := range expectedFiles {
		path := filepath.Join(destDir, file)
		if _, err := os.Stat(path); err == nil {
			foundCount++
		}
	}

	if foundCount < 2 {
		t.Errorf("expected at least 2 common cheatsheets, found %d", foundCount)
	}

	t.Log("Clone integration test passed!")

	// Test cloning to an existing directory (should fail)
	err = Clone(destDir)
	if err == nil {
		t.Error("expected error when cloning to existing repository, got nil")
	} else {
		t.Logf("Expected error when cloning to existing dir: %v", err)
	}
}
49 internal/repo/clone_test.go Normal file
@@ -0,0 +1,49 @@
package repo

import (
	"os"
	"path/filepath"
	"testing"
)

// TestClone tests the Clone function
func TestClone(t *testing.T) {
	// This test requires network access, so we only test error cases
	// that don't require actual cloning.

	t.Run("clone to read-only directory", func(t *testing.T) {
		if os.Getuid() == 0 {
			t.Skip("Cannot test read-only directory as root")
		}

		// Create a temporary directory
		tempDir, err := os.MkdirTemp("", "cheat-clone-test-*")
		if err != nil {
			t.Fatalf("failed to create temp dir: %v", err)
		}
		defer os.RemoveAll(tempDir)

		// Create a read-only subdirectory
		readOnlyDir := filepath.Join(tempDir, "readonly")
		if err := os.Mkdir(readOnlyDir, 0555); err != nil {
			t.Fatalf("failed to create read-only dir: %v", err)
		}

		// Attempt to clone into the read-only directory
		targetDir := filepath.Join(readOnlyDir, "cheatsheets")
		err = Clone(targetDir)

		// Should fail because we can't write to a read-only directory
		if err == nil {
			t.Error("expected error when cloning to read-only directory, got nil")
		}
	})

	t.Run("clone to invalid path", func(t *testing.T) {
		// Try to clone to a path with null bytes (invalid on most filesystems)
		err := Clone("/tmp/invalid\x00path")
		if err == nil {
			t.Error("expected error with invalid path, got nil")
		}
	})
}
177 internal/repo/gitdir_test.go Normal file
@@ -0,0 +1,177 @@
package repo

import (
	"fmt"
	"os"
	"path/filepath"
	"testing"
)

func TestGitDir(t *testing.T) {
	// Create a temporary directory for testing
	tempDir, err := os.MkdirTemp("", "cheat-test-*")
	if err != nil {
		t.Fatalf("failed to create temp dir: %v", err)
	}
	defer os.RemoveAll(tempDir)

	// Create test directory structure
	testDirs := []string{
		filepath.Join(tempDir, ".git"),
		filepath.Join(tempDir, ".git", "objects"),
		filepath.Join(tempDir, ".git", "refs"),
		filepath.Join(tempDir, "regular"),
		filepath.Join(tempDir, "regular", ".git"),
		filepath.Join(tempDir, "submodule"),
	}

	for _, dir := range testDirs {
		if err := os.MkdirAll(dir, 0755); err != nil {
			t.Fatalf("failed to create dir %s: %v", dir, err)
		}
	}

	// Create test files
	testFiles := map[string]string{
		filepath.Join(tempDir, ".gitignore"):           "*.tmp\n",
		filepath.Join(tempDir, ".gitattributes"):       "* text=auto\n",
		filepath.Join(tempDir, "submodule", ".git"):    "gitdir: ../.git/modules/submodule\n",
		filepath.Join(tempDir, "regular", "sheet.txt"): "content\n",
	}

	for file, content := range testFiles {
		if err := os.WriteFile(file, []byte(content), 0644); err != nil {
			t.Fatalf("failed to create file %s: %v", file, err)
		}
	}

	tests := []struct {
		name    string
		path    string
		want    bool
		wantErr bool
	}{
		{
			name: "not in git directory",
			path: filepath.Join(tempDir, "regular", "sheet.txt"),
			want: false,
		},
		{
			name: "in .git directory",
			path: filepath.Join(tempDir, ".git", "objects", "file"),
			want: true,
		},
		{
			name: "in .git/refs directory",
			path: filepath.Join(tempDir, ".git", "refs", "heads", "main"),
			want: true,
		},
		{
			name: ".gitignore file",
			path: filepath.Join(tempDir, ".gitignore"),
			want: false,
		},
		{
			name: ".gitattributes file",
			path: filepath.Join(tempDir, ".gitattributes"),
			want: false,
		},
		{
			name: "submodule with .git file",
			path: filepath.Join(tempDir, "submodule", "sheet.txt"),
			want: false,
		},
		{
			name: "path with .git in middle",
			path: filepath.Join(tempDir, "regular", ".git", "sheet.txt"),
			want: true,
		},
		{
			name: "nonexistent path without .git",
			path: filepath.Join(tempDir, "nonexistent", "file"),
			want: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := GitDir(tt.path)
			if (err != nil) != tt.wantErr {
				t.Errorf("GitDir() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if got != tt.want {
				t.Errorf("GitDir() = %v, want %v", got, tt.want)
			}
		})
	}
}

func TestGitDirEdgeCases(t *testing.T) {
	// Test with paths that have .git but not as a directory separator
	tests := []struct {
		name string
		path string
		want bool
	}{
		{
			name: "file ending with .git",
			path: "/tmp/myfile.git",
			want: false,
		},
		{
			name: "directory ending with .git",
			path: "/tmp/myrepo.git",
			want: false,
		},
		{
			name: ".github directory",
			path: "/tmp/.github/workflows",
			want: false,
		},
		{
			name: "legitimate.git-repo name",
			path: "/tmp/legitimate.git-repo/file",
			want: false,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := GitDir(tt.path)
			if err != nil {
				// It's ok if the path doesn't exist for these edge case tests
				return
			}
			if got != tt.want {
				t.Errorf("GitDir(%q) = %v, want %v", tt.path, got, tt.want)
			}
		})
	}
}

func TestGitDirPathSeparator(t *testing.T) {
	// Test that the function correctly uses os.PathSeparator.
	// This is important for cross-platform compatibility.

	// Create a path with the wrong separator for the current OS
	var wrongSep string
	if os.PathSeparator == '/' {
		wrongSep = `\`
	} else {
		wrongSep = `/`
	}

	// Path with wrong separator should not be detected as git dir
	path := fmt.Sprintf("some%spath%s.git%sfile", wrongSep, wrongSep, wrongSep)
	isGit, err := GitDir(path)

	if err != nil {
		// Path doesn't exist, which is fine
		return
	}

	if isGit {
		t.Errorf("GitDir() incorrectly detected git dir with wrong path separator")
	}
}
@@ -32,3 +32,29 @@ func TestColorize(t *testing.T) {
 		t.Errorf("failed to colorize sheet: want: %s, got: %s", want, s.Text)
 	}
 }
+
+// TestColorizeError tests the error handling in Colorize
+func TestColorizeError(_ *testing.T) {
+	// Create a sheet with content
+	sheet := Sheet{
+		Text:   "some text",
+		Syntax: "invalidlexer12345", // Use an invalid lexer that might cause issues
+	}
+
+	// Create a config with invalid formatter/style
+	conf := config.Config{
+		Formatter: "invalidformatter",
+		Style:     "invalidstyle",
+	}
+
+	// Store original text
+	originalText := sheet.Text
+
+	// Colorize should not panic even with invalid settings
+	sheet.Colorize(conf)
+
+	// The text might be unchanged if there was an error, or it might be
+	// colorized. We're mainly testing that it doesn't panic.
+	_ = sheet.Text
+	_ = originalText
+}
@@ -39,6 +39,8 @@ func (s *Sheet) Copy(dest string) error {
 	// copy file contents
 	_, err = io.Copy(outfile, infile)
 	if err != nil {
+		// Clean up the partially written file on error
+		os.Remove(dest)
 		return fmt.Errorf(
 			"failed to copy file: infile: %s, outfile: %s, err: %v",
 			s.Path,
internal/sheet/copy_error_test.go
Normal file
187
internal/sheet/copy_error_test.go
Normal file
@@ -0,0 +1,187 @@
|
||||
package sheet

import (
	"os"
	"path/filepath"
	"testing"
)

// TestCopyErrors tests error cases for the Copy method
func TestCopyErrors(t *testing.T) {
	tests := []struct {
		name    string
		setup   func() (*Sheet, string, func())
		wantErr bool
		errMsg  string
	}{
		{
			name: "source file does not exist",
			setup: func() (*Sheet, string, func()) {
				// create a sheet whose path does not exist
				sheet := &Sheet{
					Title:     "test",
					Path:      "/non/existent/file.txt",
					CheatPath: "test",
				}
				dest := filepath.Join(os.TempDir(), "copy-test-dest.txt")
				cleanup := func() {
					os.Remove(dest)
				}
				return sheet, dest, cleanup
			},
			wantErr: true,
			errMsg:  "failed to open cheatsheet",
		},
		{
			name: "destination directory creation fails",
			setup: func() (*Sheet, string, func()) {
				// create a source file
				src, err := os.CreateTemp("", "copy-test-src-*")
				if err != nil {
					t.Fatalf("failed to create temp file: %v", err)
				}
				src.WriteString("test content")
				src.Close()

				sheet := &Sheet{
					Title:     "test",
					Path:      src.Name(),
					CheatPath: "test",
				}

				// create a file where we want a directory
				blockerFile := filepath.Join(os.TempDir(), "copy-blocker-file")
				if err := os.WriteFile(blockerFile, []byte("blocker"), 0644); err != nil {
					t.Fatalf("failed to create blocker file: %v", err)
				}

				// try to create dest under the blocker file (will fail)
				dest := filepath.Join(blockerFile, "subdir", "dest.txt")

				cleanup := func() {
					os.Remove(src.Name())
					os.Remove(blockerFile)
				}
				return sheet, dest, cleanup
			},
			wantErr: true,
			errMsg:  "failed to create directory",
		},
		{
			name: "destination file creation fails",
			setup: func() (*Sheet, string, func()) {
				// create a source file
				src, err := os.CreateTemp("", "copy-test-src-*")
				if err != nil {
					t.Fatalf("failed to create temp file: %v", err)
				}
				src.WriteString("test content")
				src.Close()

				sheet := &Sheet{
					Title:     "test",
					Path:      src.Name(),
					CheatPath: "test",
				}

				// create a directory where we want the file
				destDir := filepath.Join(os.TempDir(), "copy-test-dir")
				if err := os.Mkdir(destDir, 0755); err != nil && !os.IsExist(err) {
					t.Fatalf("failed to create dest dir: %v", err)
				}

				cleanup := func() {
					os.Remove(src.Name())
					os.RemoveAll(destDir)
				}
				return sheet, destDir, cleanup
			},
			wantErr: true,
			errMsg:  "failed to create outfile",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			sheet, dest, cleanup := tt.setup()
			defer cleanup()

			err := sheet.Copy(dest)
			if (err != nil) != tt.wantErr {
				t.Errorf("Copy() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if err != nil && tt.errMsg != "" {
				if !contains(err.Error(), tt.errMsg) {
					t.Errorf("Copy() error = %v, want error containing %q", err, tt.errMsg)
				}
			}
		})
	}
}

// TestCopyIOError tests the io.Copy error case
func TestCopyIOError(t *testing.T) {
	// This is difficult to test without mocking io.Copy. The error case
	// would occur if the source file were modified or removed after
	// opening but before copying.
	t.Skip("Skipping io.Copy error test - requires file system race condition")
}

// TestCopyCleanupOnError verifies that partially written files are cleaned up on error
func TestCopyCleanupOnError(t *testing.T) {
	// create a source file that we'll make unreadable before copying
	src, err := os.CreateTemp("", "copy-test-cleanup-*")
	if err != nil {
		t.Fatalf("failed to create temp file: %v", err)
	}
	defer os.Remove(src.Name())

	// write some content
	content := "test content for cleanup"
	if _, err := src.WriteString(content); err != nil {
		t.Fatalf("failed to write content: %v", err)
	}
	src.Close()

	sheet := &Sheet{
		Title:     "test",
		Path:      src.Name(),
		CheatPath: "test",
	}

	// destination path
	dest := filepath.Join(os.TempDir(), "copy-cleanup-test.txt")
	defer os.Remove(dest) // clean up if the test fails

	// Make the source file unreadable. This is platform-specific, but
	// should work on Unix-like systems.
	if err := os.Chmod(src.Name(), 0000); err != nil {
		t.Skip("Cannot change file permissions on this platform")
	}
	defer os.Chmod(src.Name(), 0644) // restore permissions for cleanup

	// Attempt the copy. With the source unreadable, Copy fails when it
	// opens the source file, before the destination is written.
	err = sheet.Copy(dest)
	if err == nil {
		t.Error("Expected Copy to fail with permission error")
	}

	// verify that no destination file was left behind
	if _, err := os.Stat(dest); !os.IsNotExist(err) {
		t.Error("Destination file should have been removed after copy failure")
	}
}

// contains reports whether substr is within s
func contains(s, substr string) bool {
	return len(s) >= len(substr) && (s == substr || len(s) > 0 && containsHelper(s, substr))
}

func containsHelper(s, substr string) bool {
	for i := 0; i <= len(s)-len(substr); i++ {
		if s[i:i+len(substr)] == substr {
			return true
		}
	}
	return false
}
65  internal/sheet/doc.go  Normal file
@@ -0,0 +1,65 @@
// Package sheet provides functionality for parsing and managing individual cheat sheets.
//
// A sheet represents a single cheatsheet file containing helpful commands, notes,
// or documentation. Sheets can include optional YAML frontmatter for metadata
// such as tags and syntax highlighting preferences.
//
// # Sheet Format
//
// Sheets are plain text files that may begin with YAML frontmatter:
//
//	---
//	syntax: bash
//	tags: [networking, linux, ssh]
//	---
//	# Connect to remote server
//	ssh user@hostname
//
//	# Copy files over SSH
//	scp local_file user@hostname:/remote/path
//
// The frontmatter is optional. If omitted, the sheet will use default values.
//
// # Core Types
//
// The Sheet type contains:
//   - Title: The sheet's name (derived from filename)
//   - Path: Full filesystem path to the sheet
//   - Text: The content of the sheet (without frontmatter)
//   - Tags: Categories assigned to the sheet
//   - Syntax: Language hint for syntax highlighting
//   - ReadOnly: Whether the sheet can be modified
//
// # Key Functions
//
//   - New: Creates a new Sheet from a file path
//   - Parse: Extracts frontmatter and content from sheet text
//   - Search: Searches sheet content using regular expressions
//   - Colorize: Applies syntax highlighting to sheet content
//
// # Syntax Highlighting
//
// The package integrates with the Chroma library to provide syntax highlighting.
// Supported languages include bash, python, go, javascript, and many others.
// The syntax can be specified in the frontmatter or auto-detected.
//
// # Example Usage
//
//	// load a sheet from disk
//	s, err := sheet.New("/path/to/sheet", []string{"personal"}, false)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// search for content (Search takes a compiled regular expression)
//	matches := s.Search(regexp.MustCompile("ssh"))
//
//	// apply syntax highlighting
//	colorized, err := s.Colorize(config)
//	if err != nil {
//		log.Fatal(err)
//	}
package sheet
54  internal/sheet/parse_extended_test.go  Normal file
@@ -0,0 +1,54 @@
package sheet

import (
	"runtime"
	"testing"
)

// TestParseWindowsLineEndings tests parsing with Windows line endings
func TestParseWindowsLineEndings(t *testing.T) {
	// only run the Windows line-ending test on Windows
	if runtime.GOOS != "windows" {
		t.Skip("Skipping Windows line ending test on non-Windows platform")
	}

	// stub our cheatsheet content with Windows line endings
	markdown := "---\r\nsyntax: go\r\ntags: [ test ]\r\n---\r\nTo foo the bar: baz"

	// parse the frontmatter
	fm, text, err := parse(markdown)

	// assert expectations
	if err != nil {
		t.Errorf("failed to parse markdown: %v", err)
	}

	want := "To foo the bar: baz"
	if text != want {
		t.Errorf("failed to parse text: want: %s, got: %s", want, text)
	}

	want = "go"
	if fm.Syntax != want {
		t.Errorf("failed to parse syntax: want: %s, got: %s", want, fm.Syntax)
	}
}

// TestParseInvalidYAML tests parsing with invalid YAML in frontmatter
func TestParseInvalidYAML(t *testing.T) {
	// stub our cheatsheet content with invalid YAML
	markdown := `---
syntax: go
tags: [ test
unclosed bracket
---
To foo the bar: baz`

	// parse the frontmatter
	_, _, err := parse(markdown)

	// assert that an error was returned for invalid YAML
	if err == nil {
		t.Error("expected error for invalid YAML, got nil")
	}
}
132  internal/sheet/parse_fuzz_test.go  Normal file
@@ -0,0 +1,132 @@
package sheet

import (
	"strings"
	"testing"
)

// FuzzParse tests the parse function with fuzzing to uncover edge cases
// and potential panics in YAML frontmatter parsing
func FuzzParse(f *testing.F) {
	// Add seed corpus with various valid and edge case inputs.

	// valid frontmatter
	f.Add("---\nsyntax: go\n---\nContent")
	f.Add("---\ntags: [a, b]\n---\n")
	f.Add("---\nsyntax: bash\ntags: [linux, shell]\n---\n#!/bin/bash\necho hello")

	// no frontmatter
	f.Add("No frontmatter here")
	f.Add("")
	f.Add("Just plain text\nwith multiple lines")

	// edge cases with delimiters
	f.Add("---")
	f.Add("---\n")
	f.Add("---\n---")
	f.Add("---\n---\n")
	f.Add("---\n---\n---")
	f.Add("---\n---\n---\n---")
	f.Add("------\n------")

	// invalid YAML
	f.Add("---\n{invalid yaml\n---\n")
	f.Add("---\nsyntax: \"unclosed quote\n---\n")
	f.Add("---\ntags: [a, b,\n---\n")

	// Windows line endings
	f.Add("---\r\nsyntax: go\r\n---\r\nContent")
	f.Add("---\r\n---\r\n")

	// mixed line endings
	f.Add("---\nsyntax: go\r\n---\nContent")
	f.Add("---\r\nsyntax: go\n---\r\nContent")

	// Unicode and special characters
	f.Add("---\ntags: [emoji, 🎉]\n---\n")
	f.Add("---\nsyntax: 中文\n---\n")
	f.Add("---\ntags: [\x00, \x01]\n---\n")

	// very long inputs
	f.Add("---\ntags: [" + strings.Repeat("a,", 1000) + "a]\n---\n")
	f.Add("---\n" + strings.Repeat("field: value\n", 1000) + "---\n")

	// nested structures
	f.Add("---\ntags:\n - nested\n - list\n---\n")
	f.Add("---\nmeta:\n author: test\n version: 1.0\n---\n")

	f.Fuzz(func(t *testing.T, input string) {
		// the parse function should never panic, regardless of input
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("parse panicked with input %q: %v", input, r)
				}
			}()

			fm, text, err := parse(input)

			// verify invariants
			if err == nil {
				// If parsing succeeded, validate the result.

				// The returned text should be a suffix of the input (either
				// the whole input if no frontmatter, or the part after it).
				if !strings.HasSuffix(input, text) && text != input {
					t.Errorf("returned text %q is not a valid suffix of input %q", text, input)
				}

				// If the input starts with a delimiter and has valid
				// frontmatter, text should be shorter than the input.
				if strings.HasPrefix(input, "---\n") || strings.HasPrefix(input, "---\r\n") {
					if len(fm.Tags) > 0 || fm.Syntax != "" {
						// we successfully parsed frontmatter, so text should be shorter
						if len(text) >= len(input) {
							t.Errorf("text length %d should be less than input length %d when frontmatter is parsed",
								len(text), len(input))
						}
					}
				}

				// Note: Tags can be nil when frontmatter is not present or
				// empty; this is expected for uninitialized slices in Go.
			} else {
				// if parsing failed, the original input should be returned as text
				if text != input {
					t.Errorf("on error, text should equal input: got %q, want %q", text, input)
				}
			}
		}()
	})
}

// FuzzParseDelimiterHandling specifically tests delimiter edge cases
func FuzzParseDelimiterHandling(f *testing.F) {
	// seed corpus focusing on delimiter variations
	f.Add("---", "content")
	f.Add("", "---")
	f.Add("---", "---")
	f.Add("", "")

	f.Fuzz(func(t *testing.T, prefix string, suffix string) {
		// build input with controllable parts around delimiters
		inputs := []string{
			prefix + "---\n" + suffix,
			prefix + "---\r\n" + suffix,
			prefix + "---\n---\n" + suffix,
			prefix + "---\r\n---\r\n" + suffix,
			prefix + "---\n" + "yaml: data\n" + "---\n" + suffix,
		}

		for _, input := range inputs {
			func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("parse panicked with constructed input: %v", r)
					}
				}()

				_, _, _ = parse(input)
			}()
		}
	})
}
@@ -9,16 +9,17 @@ import (
 func (s *Sheet) Search(reg *regexp.Regexp) string {

 	// record matches
-	matches := ""
+	var matches []string

 	// search through the cheatsheet's text line by line
 	for _, line := range strings.Split(s.Text, "\n\n") {

-		// exit early if the line doesn't match the regex
+		// save matching lines
 		if reg.MatchString(line) {
-			matches += line + "\n\n"
+			matches = append(matches, line)
 		}
 	}

-	return strings.TrimSpace(matches)
+	// join matches with the same delimiter used for splitting
+	return strings.Join(matches, "\n\n")
 }
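The hunk above replaces repeated string concatenation (plus a trailing `TrimSpace`) with a slice that is joined once at the end. A minimal standalone sketch of the two approaches, using `strings.Contains` in place of the real regexp match and illustrative function names, shows they agree on simple inputs:

```go
package main

import (
	"fmt"
	"strings"
)

// searchConcat mimics the old implementation: accumulate matches into a
// string with a trailing delimiter, then trim the excess at the end.
func searchConcat(paragraphs []string, substr string) string {
	matches := ""
	for _, p := range paragraphs {
		if strings.Contains(p, substr) {
			matches += p + "\n\n"
		}
	}
	return strings.TrimSpace(matches)
}

// searchJoin mimics the new implementation: collect matches into a slice
// and join them once, so no trailing delimiter is ever produced.
func searchJoin(paragraphs []string, substr string) string {
	var matches []string
	for _, p := range paragraphs {
		if strings.Contains(p, substr) {
			matches = append(matches, p)
		}
	}
	return strings.Join(matches, "\n\n")
}

func main() {
	ps := []string{"# tar\ntar -xvf file.tar", "# ssh\nssh user@host"}
	fmt.Println(searchConcat(ps, "ssh") == searchJoin(ps, "ssh")) // true
}
```

One subtle behavioral difference: `TrimSpace` would also strip leading whitespace from the first matching paragraph, while `Join` preserves each match exactly, which is likely why the join-based version is preferred.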
190  internal/sheet/search_fuzz_test.go  Normal file
@@ -0,0 +1,190 @@
package sheet

import (
	"regexp"
	"strings"
	"testing"
	"time"
)

// FuzzSearchRegex tests the regex compilation and search functionality
// to ensure it handles malformed patterns gracefully and doesn't suffer
// from catastrophic backtracking
func FuzzSearchRegex(f *testing.F) {
	// Add seed corpus with various regex patterns.

	// valid patterns
	f.Add("test", "This is a test string")
	f.Add("(?i)test", "This is a TEST string")
	f.Add("foo|bar", "foo and bar")
	f.Add("^start", "start of line\nnext line")
	f.Add("end$", "at the end\nnext line")
	f.Add("\\d+", "123 numbers 456")
	f.Add("[a-z]+", "lowercase UPPERCASE")

	// edge cases and potentially problematic patterns
	f.Add("", "empty pattern")
	f.Add(".", "any character")
	f.Add(".*", "match everything")
	f.Add(".+", "match something")
	f.Add("\\", "backslash")
	f.Add("(", "unclosed paren")
	f.Add(")", "unmatched paren")
	f.Add("[", "unclosed bracket")
	f.Add("]", "unmatched bracket")
	f.Add("[^]", "negated empty class")
	f.Add("(?", "incomplete group")

	// patterns that might cause performance issues
	f.Add("(a+)+", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(a*)*", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(a|a)*", "aaaaaaaaaaaaaaaaaaaaaaaab")
	f.Add("(.*)*", "any text here")
	f.Add("(\\d+)+", "123456789012345678901234567890x")

	// Unicode patterns
	f.Add("☺", "Unicode ☺ smiley")
	f.Add("[一-龯]", "Chinese 中文 characters")
	f.Add("\\p{L}+", "Unicode letters")

	// very long patterns
	f.Add(strings.Repeat("a", 1000), "long pattern")
	f.Add(strings.Repeat("(a|b)", 100), "complex pattern")

	f.Fuzz(func(t *testing.T, pattern string, text string) {
		// Test 1: regex compilation should not panic
		var reg *regexp.Regexp
		var compileErr error

		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("regexp.Compile panicked with pattern %q: %v", pattern, r)
				}
			}()

			reg, compileErr = regexp.Compile(pattern)
		}()

		// if compilation failed, that's OK - we're testing error handling
		if compileErr != nil {
			return
		}

		// Test 2: create a sheet and exercise the Search method
		sheet := Sheet{
			Title: "test",
			Text:  text,
		}

		// Search should not panic
		var result string
		done := make(chan bool, 1)

		go func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Search panicked with pattern %q on text %q: %v", pattern, text, r)
				}
				done <- true
			}()

			result = sheet.Search(reg)
		}()

		// time out after 100ms to catch catastrophic backtracking
		select {
		case <-done:
			// search completed
		case <-time.After(100 * time.Millisecond):
			t.Errorf("Search timed out (possible catastrophic backtracking) with pattern %q on text %q", pattern, text)
			// don't read result while the goroutine may still be writing it
			return
		}

		// Test 3: verify search result invariants
		if result != "" {
			// Search splits by "\n\n", so compare using the same logic
			resultLines := strings.Split(result, "\n\n")
			textLines := strings.Split(text, "\n\n")

			// every result line should exist in the original text lines
			for _, rLine := range resultLines {
				found := false
				for _, tLine := range textLines {
					if rLine == tLine {
						found = true
						break
					}
				}
				if !found && rLine != "" {
					t.Errorf("Search result contains line not in original text: %q", rLine)
				}
			}
		}
	})
}

// FuzzSearchCatastrophicBacktracking specifically tests for regex patterns
// that could cause performance issues
func FuzzSearchCatastrophicBacktracking(f *testing.F) {
	// seed with patterns known to potentially cause issues
	f.Add("a", 10, 5)
	f.Add("x", 20, 3)

	f.Fuzz(func(t *testing.T, char string, repeats int, groups int) {
		// limit the size to avoid memory issues in the test
		if repeats > 30 || repeats < 0 || groups > 10 || groups < 0 || len(char) > 5 {
			t.Skip("Skipping invalid or overly large test case")
		}

		// construct patterns that might cause backtracking
		patterns := []string{
			strings.Repeat(char, repeats),
			"(" + char + "+)+",
			"(" + char + "*)*",
			"(" + char + "|" + char + ")*",
		}

		// add nested groups
		if groups > 0 && groups < 10 {
			nested := char
			for i := 0; i < groups; i++ {
				nested = "(" + nested + ")+"
			}
			patterns = append(patterns, nested)
		}

		// test text that might trigger backtracking
		testText := strings.Repeat(char, repeats) + "x"

		for _, pattern := range patterns {
			// try to compile the pattern
			reg, err := regexp.Compile(pattern)
			if err != nil {
				// invalid pattern, skip
				continue
			}

			// test with timeout
			done := make(chan bool, 1)

			go func() {
				defer func() {
					if r := recover(); r != nil {
						t.Errorf("Search panicked with backtracking pattern %q: %v", pattern, r)
					}
					done <- true
				}()

				sheet := Sheet{Text: testText}
				_ = sheet.Search(reg)
			}()

			select {
			case <-done:
				// completed successfully
			case <-time.After(50 * time.Millisecond):
				t.Logf("Warning: potential backtracking issue with pattern %q (completed slowly)", pattern)
			}
		}
	})
}
94  internal/sheet/tagged_fuzz_test.go  Normal file
@@ -0,0 +1,94 @@
package sheet

import (
	"strings"
	"testing"
)

// FuzzTagged tests the Tagged function with potentially malicious tag inputs.
//
// Threat model: an attacker crafts a malicious cheatsheet with specially
// crafted tags that could cause issues when a user searches/filters by tags.
// This is particularly relevant for shared community cheatsheets.
func FuzzTagged(f *testing.F) {
	// Add seed corpus with potentially problematic inputs. These represent
	// tags an attacker might place in a malicious cheatsheet.
	f.Add("normal", "normal")
	f.Add("", "")
	f.Add(" ", " ")
	f.Add("\n", "\n")
	f.Add("\r\n", "\r\n")
	f.Add("\x00", "\x00")                         // null byte
	f.Add("../../etc/passwd", "../../etc/passwd") // path traversal attempt
	f.Add("'; DROP TABLE sheets;--", "sql")       // SQL injection attempt
	f.Add("<script>alert('xss')</script>", "xss") // XSS attempt
	f.Add("${HOME}", "${HOME}")                   // environment variable
	f.Add("$(whoami)", "$(whoami)")               // command substitution
	f.Add("`date`", "`date`")                     // command substitution
	f.Add("\\x41\\x42", "\\x41\\x42")             // escape sequences
	f.Add("%00", "%00")                           // URL-encoded null
	f.Add("tag\nwith\nnewlines", "tag")
	f.Add(strings.Repeat("a", 10000), "a") // very long tag
	f.Add("🎉", "🎉")                        // Unicode
	f.Add("\U0001F4A9", "\U0001F4A9")      // Unicode emoji (U+1F4A9)
	f.Add("tag with spaces", "tag with spaces")
	f.Add("TAG", "tag") // case sensitivity check
	f.Add("tag", "TAG") // case sensitivity check

	f.Fuzz(func(t *testing.T, sheetTag string, searchTag string) {
		// create a sheet with the potentially malicious tag
		sheet := Sheet{
			Title: "test",
			Tags:  []string{sheetTag},
		}

		// the Tagged function should never panic, regardless of input
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tagged panicked with sheetTag=%q, searchTag=%q: %v",
						sheetTag, searchTag, r)
				}
			}()

			result := sheet.Tagged(searchTag)

			// verify the result is consistent with a simple string comparison
			expected := false
			for _, tag := range sheet.Tags {
				if tag == searchTag {
					expected = true
					break
				}
			}

			if result != expected {
				t.Errorf("Tagged returned %v but expected %v for sheetTag=%q, searchTag=%q",
					result, expected, sheetTag, searchTag)
			}

			// additional invariant: Tagged should be case-sensitive
			if sheetTag != searchTag && result {
				t.Errorf("Tagged matched different strings: sheetTag=%q, searchTag=%q",
					sheetTag, searchTag)
			}
		}()

		// test with multiple tags, including duplicates of the fuzzed one
		sheetMulti := Sheet{
			Title: "test",
			Tags:  []string{"safe1", sheetTag, "safe2", sheetTag},
		}

		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tagged panicked with multiple tags including %q: %v",
						sheetTag, r)
				}
			}()

			_ = sheetMulti.Tagged(searchTag)
		}()
	})
}
4  internal/sheet/testdata/fuzz/FuzzSearchCatastrophicBacktracking/3ad1cae1b78a2478  vendored  Normal file
@@ -0,0 +1,4 @@
go test fuzz v1
string("0")
int(-6)
int(5)
3  internal/sheet/testdata/fuzz/FuzzSearchRegex/74c0a5e8e3464bfd  vendored  Normal file
@@ -0,0 +1,3 @@
go test fuzz v1
string(".")
string(" 0000\n\n\n\n00000")
65  internal/sheets/doc.go  Normal file
@@ -0,0 +1,65 @@
// Package sheets manages collections of cheat sheets across multiple cheatpaths.
//
// The sheets package provides functionality to:
//   - Load sheets from multiple cheatpaths
//   - Consolidate duplicate sheets (with precedence rules)
//   - Filter sheets by tags
//   - Sort sheets alphabetically
//   - Extract unique tags across all sheets
//
// # Loading Sheets
//
// Sheets are loaded recursively from cheatpath directories, excluding:
//   - Hidden files (starting with .)
//   - Files in .git directories
//   - Files with extensions (sheets have no extension)
//
// # Consolidation
//
// When multiple cheatpaths contain sheets with the same name, consolidation
// rules apply based on the order of cheatpaths. Sheets from earlier paths
// override those from later paths, allowing personal sheets to override
// community sheets.
//
// Example:
//
//	cheatpaths:
//	  1. personal:  ~/cheat
//	  2. community: ~/cheat/community
//
// If both contain "git", the version from "personal" is used.
//
// # Filtering
//
// Sheets can be filtered by:
//   - Tags: Include only sheets with specific tags
//   - Cheatpath: Include only sheets from specific paths
//
// # Key Functions
//
//   - Load: Loads all sheets from the given cheatpaths
//   - Filter: Filters sheets by tag
//   - Consolidate: Merges sheets from multiple paths with precedence
//   - Sort: Sorts sheets alphabetically by title
//   - Tags: Extracts all unique tags from sheets
//
// # Example Usage
//
//	// load sheets from all cheatpaths
//	allSheets, err := sheets.Load(cheatpaths)
//	if err != nil {
//		log.Fatal(err)
//	}
//
//	// consolidate to handle duplicates
//	consolidated := sheets.Consolidate(allSheets)
//
//	// filter by tag (Filter takes a slice of tags)
//	filtered := sheets.Filter(consolidated, []string{"networking"})
//
//	// sort alphabetically
//	sheets.Sort(filtered)
//
//	// get all unique tags
//	tags := sheets.Tags(consolidated)
package sheets
@@ -2,6 +2,7 @@ package sheets

 import (
 	"strings"
+	"unicode/utf8"

 	"github.com/cheat/cheat/internal/sheet"
 )
@@ -31,7 +32,8 @@ func Filter(
 	// iterate over each tag. If the sheet does not match *all* tags, filter
 	// it out.
 	for _, tag := range tags {
-		if !sheet.Tagged(strings.TrimSpace(tag)) {
+		trimmed := strings.TrimSpace(tag)
+		if trimmed == "" || !utf8.ValidString(trimmed) || !sheet.Tagged(trimmed) {
 			keep = false
 		}
 	}
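The new guard in Filter trims each tag and treats empty or invalid-UTF-8 tags as non-matching before consulting `sheet.Tagged`. A standalone sketch of that logic (`tagMatches` is an illustrative stand-in for the filter step, not the package's API; exact-string membership stands in for `Tagged`):

```go
package main

import (
	"fmt"
	"strings"
	"unicode/utf8"
)

// tagMatches mirrors the guard added above: the tag is trimmed, and an
// empty or non-UTF-8 tag can never match any sheet tag.
func tagMatches(sheetTags []string, tag string) bool {
	trimmed := strings.TrimSpace(tag)
	if trimmed == "" || !utf8.ValidString(trimmed) {
		return false
	}
	for _, t := range sheetTags {
		if t == trimmed {
			return true
		}
	}
	return false
}

func main() {
	tags := []string{"linux", "ssh"}
	fmt.Println(tagMatches(tags, " linux ")) // true: whitespace is trimmed
	fmt.Println(tagMatches(tags, ""))        // false: empty tag
	fmt.Println(tagMatches(tags, "\xff"))    // false: invalid UTF-8
}
```

Note that because the guard sets `keep = false`, a sheet is dropped whenever any supplied tag is empty or invalid, which is the conservative choice for a filter.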
177  internal/sheets/filter_fuzz_test.go  Normal file
@@ -0,0 +1,177 @@
package sheets

import (
	"strings"
	"testing"

	"github.com/cheat/cheat/internal/sheet"
)

// FuzzFilter tests the Filter function with various tag combinations
func FuzzFilter(f *testing.F) {
	// Add seed corpus with various tag scenarios.
	// Format: comma-separated tags to filter by.
	f.Add("linux")
	f.Add("linux,bash")
	f.Add("linux,bash,ssh")
	f.Add("")
	f.Add(" ")
	f.Add(" linux ")
	f.Add("linux,")
	f.Add(",linux")
	f.Add(",,")
	f.Add("linux,,bash")
	f.Add("tag-with-dash")
	f.Add("tag_with_underscore")
	f.Add("UPPERCASE")
	f.Add("miXedCase")
	f.Add("🎉emoji")
	f.Add("tag with spaces")
	f.Add("\ttab\ttag")
	f.Add("tag\nwith\nnewline")
	f.Add("very-long-tag-name-that-might-cause-issues-somewhere")
	f.Add(strings.Repeat("a,", 100) + "a")

	f.Fuzz(func(t *testing.T, tagString string) {
		// split the tag string into individual tags
		var tags []string
		if tagString != "" {
			tags = strings.Split(tagString, ",")
		}

		// create test data - some sheets with various tags
		cheatpaths := []map[string]sheet.Sheet{
			{
				"sheet1": sheet.Sheet{
					Title: "sheet1",
					Tags:  []string{"linux", "bash"},
				},
				"sheet2": sheet.Sheet{
					Title: "sheet2",
					Tags:  []string{"linux", "ssh", "networking"},
				},
				"sheet3": sheet.Sheet{
					Title: "sheet3",
					Tags:  []string{"UPPERCASE", "miXedCase"},
				},
			},
			{
				"sheet4": sheet.Sheet{
					Title: "sheet4",
					Tags:  []string{"tag with spaces", "🎉emoji"},
				},
				"sheet5": sheet.Sheet{
					Title: "sheet5",
					Tags:  []string{}, // no tags
				},
			},
		}

		// the function should not panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Filter panicked with tags %q: %v", tags, r)
				}
			}()

			result := Filter(cheatpaths, tags)

			// Verify invariants:
			// 1. The result should have the same number of cheatpaths.
			if len(result) != len(cheatpaths) {
				t.Errorf("Filter changed number of cheatpaths: got %d, want %d",
					len(result), len(cheatpaths))
			}

			// 2. Each filtered sheet should contain all requested tags.
			for _, filteredPath := range result {
				for title, sheet := range filteredPath {
					// verify this sheet has all the tags we filtered for
					for _, tag := range tags {
						trimmedTag := strings.TrimSpace(tag)
						if trimmedTag == "" {
							continue // skip empty tags
						}
						if !sheet.Tagged(trimmedTag) {
							t.Errorf("Sheet %q passed filter but doesn't have tag %q",
								title, trimmedTag)
						}
					}
				}
			}

			// 3. An empty tag list should return all sheets.
			if len(tags) == 0 || (len(tags) == 1 && tags[0] == "") {
				totalOriginal := 0
				totalFiltered := 0
				for _, path := range cheatpaths {
					totalOriginal += len(path)
				}
				for _, path := range result {
					totalFiltered += len(path)
				}
				if totalFiltered != totalOriginal {
					t.Errorf("Empty filter should return all sheets: got %d, want %d",
						totalFiltered, totalOriginal)
				}
			}
		}()
	})
}

// FuzzFilterEdgeCases tests Filter with extreme inputs
func FuzzFilterEdgeCases(f *testing.F) {
	// seed with number of tags and tag length
	f.Add(0, 0)
	f.Add(1, 10)
	f.Add(10, 10)
	f.Add(100, 5)
	f.Add(1000, 3)

	f.Fuzz(func(t *testing.T, numTags int, tagLen int) {
		// limit to reasonable values to avoid memory issues
		if numTags > 1000 || numTags < 0 || tagLen > 100 || tagLen < 0 {
			t.Skip("Skipping unreasonable test case")
		}

		// generate tags
		tags := make([]string, numTags)
		for i := 0; i < numTags; i++ {
			// create a tag of the specified length
			if tagLen > 0 {
				tags[i] = strings.Repeat("a", tagLen) + string(rune(i%26+'a'))
			}
		}

		// create a sheet with no tags (should be filtered out)
		cheatpaths := []map[string]sheet.Sheet{
			{
				"test": sheet.Sheet{
					Title: "test",
					Tags:  []string{},
				},
			},
		}

		// should not panic with many tags
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Filter panicked with %d tags of length %d: %v",
						numTags, tagLen, r)
				}
			}()

			result := Filter(cheatpaths, tags)

			// with non-matching tags, the result should be empty
			if numTags > 0 && tagLen > 0 {
				if len(result[0]) != 0 {
					t.Errorf("Expected empty result with non-matching tags, got %d sheets",
						len(result[0]))
				}
			}
		}()
	})
}
@@ -20,7 +20,7 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
 	sheets := make([]map[string]sheet.Sheet, len(cheatpaths))

 	// iterate over each cheatpath
-	for _, cheatpath := range cheatpaths {
+	for i, cheatpath := range cheatpaths {

 		// vivify the map of cheatsheets on this specific cheatpath
 		pathsheets := make(map[string]sheet.Sheet)
@@ -43,6 +43,19 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {
 			return nil
 		}

+		// get the base filename
+		filename := filepath.Base(path)
+
+		// skip hidden files (files that start with a dot)
+		if strings.HasPrefix(filename, ".") {
+			return nil
+		}
+
+		// skip files with extensions (cheatsheets have no extension)
+		if filepath.Ext(filename) != "" {
+			return nil
+		}
+
 		// calculate the cheatsheet's "title" (the phrase with which it may be
 		// accessed. Eg: `cheat tar` - `tar` is the title)
 		title := strings.TrimPrefix(
@@ -88,7 +101,7 @@ func Load(cheatpaths []cp.Cheatpath) ([]map[string]sheet.Sheet, error) {

 		// store the sheets on this cheatpath alongside the other cheatsheets on
 		// other cheatpaths
-		sheets = append(sheets, pathsheets)
+		sheets[i] = pathsheets
 	}

 	// return the cheatsheets, grouped by cheatpath
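The hunks above appear to fix an append-on-a-pre-sized-slice bug: `make([]map[string]sheet.Sheet, len(cheatpaths))` already creates `len(cheatpaths)` zero-valued (nil) map elements, so appending to it doubles the slice with nil maps at the front. A minimal standalone sketch of the failure mode (not part of the patch; names are illustrative):

```go
package main

import "fmt"

func main() {
	paths := []string{"community", "personal"}

	// Buggy: make() already sized the slice to len(paths), so append
	// grows it past that length, leaving nil maps in the lead positions.
	buggy := make([]map[string]bool, len(paths))
	for range paths {
		buggy = append(buggy, map[string]bool{"loaded": true})
	}
	fmt.Println(len(buggy), buggy[0] == nil) // 4 true

	// Fixed: assign by index, as the patch does with sheets[i] = pathsheets.
	fixed := make([]map[string]bool, len(paths))
	for i := range paths {
		fixed[i] = map[string]bool{"loaded": true}
	}
	fmt.Println(len(fixed), fixed[0] == nil) // 2 false
}
```

Alternatively, `make(..., 0, len(cheatpaths))` would keep the append style; indexed assignment is the smaller diff.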
@@ -26,19 +26,26 @@ func TestLoad(t *testing.T) {
 	}

 	// load cheatsheets
-	sheets, err := Load(cheatpaths)
+	cheatpathSheets, err := Load(cheatpaths)
 	if err != nil {
 		t.Errorf("failed to load cheatsheets: %v", err)
 	}

 	// assert that the correct number of sheets loaded
 	// (sheet load details are tested in `sheet_test.go`)
+	totalSheets := 0
+	for _, sheets := range cheatpathSheets {
+		totalSheets += len(sheets)
+	}
+
+	// we expect 4 total sheets (2 from community, 2 from personal)
+	// hidden files and files with extensions are excluded
 	want := 4
-	if len(sheets) != want {
+	if totalSheets != want {
 		t.Errorf(
 			"failed to load correct number of cheatsheets: want: %d, got: %d",
 			want,
-			len(sheets),
+			totalSheets,
 		)
 	}
 }
@@ -2,6 +2,7 @@ package sheets

 import (
 	"sort"
+	"unicode/utf8"

 	"github.com/cheat/cheat/internal/sheet"
 )
@@ -16,7 +17,10 @@ func Tags(cheatpaths []map[string]sheet.Sheet) []string {
 	for _, path := range cheatpaths {
 		for _, sheet := range path {
 			for _, tag := range sheet.Tags {
-				tags[tag] = true
+				// Skip invalid UTF-8 tags to prevent downstream issues
+				if utf8.ValidString(tag) {
+					tags[tag] = true
+				}
 			}
 		}
 	}
190	internal/sheets/tags_fuzz_test.go	Normal file
@@ -0,0 +1,190 @@
package sheets

import (
	"strings"
	"testing"
	"unicode/utf8"

	"github.com/cheat/cheat/internal/sheet"
)

// FuzzTags tests the Tags function with various tag combinations
func FuzzTags(f *testing.F) {
	// Add seed corpus
	// Format: comma-separated tags that will be distributed across sheets
	f.Add("linux,bash,ssh")
	f.Add("")
	f.Add("single")
	f.Add("duplicate,duplicate,duplicate")
	f.Add(" spaces , around , tags ")
	f.Add("MiXeD,UPPER,lower")
	f.Add("special-chars,under_score,dot.ted")
	f.Add("emoji🎉,unicode中文,symbols@#$")
	f.Add("\ttab,\nnewline,\rcarriage")
	f.Add(",,,,")                                          // Multiple empty tags
	f.Add(strings.Repeat("tag,", 100))                     // Many tags
	f.Add("a," + strings.Repeat("very-long-tag-name", 10)) // Long tag names

	f.Fuzz(func(t *testing.T, tagString string) {
		// Split tags and distribute them across multiple sheets
		var allTags []string
		if tagString != "" {
			allTags = strings.Split(tagString, ",")
		}

		// Create test cheatpaths with various tag distributions
		cheatpaths := []map[string]sheet.Sheet{}

		// Distribute tags across 3 paths with overlapping tags
		for i := 0; i < 3; i++ {
			path := make(map[string]sheet.Sheet)

			// Each path gets some subset of tags
			for j, tag := range allTags {
				if j%3 == i || j%(i+2) == 0 { // Create some overlap
					sheetName := string(rune('a' + j%26))
					path[sheetName] = sheet.Sheet{
						Title: sheetName,
						Tags:  []string{tag},
					}
				}
			}

			// Add a sheet with multiple tags
			if len(allTags) > 1 {
				path["multi"] = sheet.Sheet{
					Title: "multi",
					Tags:  allTags[:len(allTags)/2+1], // First half of tags
				}
			}

			cheatpaths = append(cheatpaths, path)
		}

		// The function should not panic
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tags panicked with input %q: %v", tagString, r)
				}
			}()

			result := Tags(cheatpaths)

			// Verify invariants
			// 1. Result should be sorted
			for i := 1; i < len(result); i++ {
				if result[i-1] >= result[i] {
					t.Errorf("Tags not sorted: %q >= %q at positions %d, %d",
						result[i-1], result[i], i-1, i)
				}
			}

			// 2. No duplicates in result
			seen := make(map[string]bool)
			for _, tag := range result {
				if seen[tag] {
					t.Errorf("Duplicate tag in result: %q", tag)
				}
				seen[tag] = true
			}

			// 3. All non-empty tags from input should be in result
			// (This is approximate since we distributed tags in a complex way)
			inputTags := make(map[string]bool)
			for _, tag := range allTags {
				if tag != "" {
					inputTags[tag] = true
				}
			}

			resultTags := make(map[string]bool)
			for _, tag := range result {
				resultTags[tag] = true
			}

			// Result might have fewer tags due to distribution logic,
			// but shouldn't have tags not in the input
			for tag := range resultTags {
				found := false
				for inputTag := range inputTags {
					if tag == inputTag {
						found = true
						break
					}
				}
				if !found && tag != "" {
					t.Errorf("Result contains tag %q not derived from input", tag)
				}
			}

			// 4. Valid UTF-8 (Tags function should filter out invalid UTF-8)
			for _, tag := range result {
				if !utf8.ValidString(tag) {
					t.Errorf("Invalid UTF-8 in tag: %q", tag)
				}
			}
		}()
	})
}

// FuzzTagsStress tests Tags function with large numbers of tags
func FuzzTagsStress(f *testing.F) {
	// Seed: number of unique tags, number of sheets, tags per sheet
	f.Add(10, 10, 5)
	f.Add(100, 50, 10)
	f.Add(1000, 100, 20)

	f.Fuzz(func(t *testing.T, numUniqueTags int, numSheets int, tagsPerSheet int) {
		// Limit to reasonable values
		if numUniqueTags > 1000 || numUniqueTags < 0 ||
			numSheets > 1000 || numSheets < 0 ||
			tagsPerSheet > 100 || tagsPerSheet < 0 {
			t.Skip("Skipping unreasonable test case")
		}

		// Generate unique tags
		uniqueTags := make([]string, numUniqueTags)
		for i := 0; i < numUniqueTags; i++ {
			uniqueTags[i] = "tag" + string(rune(i))
		}

		// Create sheets with random tags
		cheatpaths := []map[string]sheet.Sheet{
			make(map[string]sheet.Sheet),
		}

		for i := 0; i < numSheets; i++ {
			// Select random tags for this sheet
			sheetTags := make([]string, 0, tagsPerSheet)
			for j := 0; j < tagsPerSheet && j < numUniqueTags; j++ {
				// Distribute tags across sheets
				tagIndex := (i*tagsPerSheet + j) % numUniqueTags
				sheetTags = append(sheetTags, uniqueTags[tagIndex])
			}

			cheatpaths[0]["sheet"+string(rune(i))] = sheet.Sheet{
				Title: "sheet" + string(rune(i)),
				Tags:  sheetTags,
			}
		}

		// Should handle large numbers efficiently
		func() {
			defer func() {
				if r := recover(); r != nil {
					t.Errorf("Tags panicked with %d unique tags, %d sheets, %d tags/sheet: %v",
						numUniqueTags, numSheets, tagsPerSheet, r)
				}
			}()

			result := Tags(cheatpaths)

			// Should have at most numUniqueTags in result
			if len(result) > numUniqueTags {
				t.Errorf("More tags in result (%d) than unique tags created (%d)",
					len(result), numUniqueTags)
			}
		}()
	})
}
2	internal/sheets/testdata/fuzz/FuzzFilter/4316c263ab833860	vendored	Normal file
@@ -0,0 +1,2 @@
go test fuzz v1
string("\xd7")

2	internal/sheets/testdata/fuzz/FuzzTags/28f36ef487f23e6c	vendored	Normal file
@@ -0,0 +1,2 @@
go test fuzz v1
string("\xf0")
4	vendor/github.com/alecthomas/chroma/v2/.editorconfig	generated	vendored
@@ -11,7 +11,3 @@ insert_final_newline = true
indent_style = space
indent_size = 2
insert_final_newline = false

[*.yml]
indent_style = space
indent_size = 2
2	vendor/github.com/alecthomas/chroma/v2/.golangci.yml	generated	vendored
@@ -49,8 +49,6 @@ linters:
    - nosnakecase
    - testableexamples
    - musttag
    - depguard
    - goconst

linters-settings:
  govet:
105	vendor/github.com/alecthomas/chroma/v2/README.md	generated	vendored
@@ -8,72 +8,75 @@ highlighted HTML, ANSI-coloured text, etc.
Chroma is based heavily on [Pygments](http://pygments.org/), and includes
translators for Pygments lexers and styles.

<a id="markdown-table-of-contents" name="table-of-contents"></a>

## Table of Contents

<!-- TOC -->

1. [Supported languages](#supported-languages)
2. [Try it](#try-it)
3. [Using the library](#using-the-library)
1. [Table of Contents](#table-of-contents)
2. [Supported languages](#supported-languages)
3. [Try it](#try-it)
4. [Using the library](#using-the-library)
   1. [Quick start](#quick-start)
   2. [Identifying the language](#identifying-the-language)
   3. [Formatting the output](#formatting-the-output)
   4. [The HTML formatter](#the-html-formatter)
4. [More detail](#more-detail)
5. [More detail](#more-detail)
   1. [Lexers](#lexers)
   2. [Formatters](#formatters)
   3. [Styles](#styles)
5. [Command-line interface](#command-line-interface)
6. [Testing lexers](#testing-lexers)
7. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)
6. [Command-line interface](#command-line-interface)
7. [Testing lexers](#testing-lexers)
8. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)

<!-- /TOC -->

<a id="markdown-supported-languages" name="supported-languages"></a>

## Supported languages

| Prefix | Language |
| :----: | -------- |
| A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Agda, AL, Alloy, Angular2, ANTLR, ApacheConf, APL, AppleScript, ArangoDB AQL, Arduino, ArmAsm, AutoHotkey, AutoIt, Awk |
| B | Ballerina, Bash, Bash Session, Batchfile, BibTeX, Bicep, BlitzBasic, BNF, BQN, Brainfuck |
| C | C, C#, C++, Caddyfile, Caddyfile Directives, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Chapel, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython |
| D | D, Dart, Dax, Desktop Entry, Diff, Django/Jinja, dns, Docker, DTD, Dylan |
| E | EBNF, Elixir, Elm, EmacsLisp, Erlang |
| F | Factor, Fennel, Fish, Forth, Fortran, FortranFixed, FSharp |
| G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, Gherkin, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groff, Groovy |
| H | Handlebars, Hare, Haskell, Haxe, HCL, Hexdump, HLB, HLSL, HolyC, HTML, HTTP, Hy |
| I | Idris, Igor, INI, Io, ISCdhcpd |
| J | J, Java, JavaScript, JSON, Julia, Jungle |
| K | Kotlin |
| L | Lighttpd configuration file, LLVM, Lua |
| M | Makefile, Mako, markdown, Mason, Materialize SQL dialect, Mathematica, Matlab, mcfunction, Meson, Metal, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL |
| N | NASM, Natural, Newspeak, Nginx configuration file, Nim, Nix |
| O | Objective-C, OCaml, Octave, Odin, OnesEnterprise, OpenEdge ABL, OpenSCAD, Org Mode |
| P | PacmanConf, Perl, PHP, PHTML, Pig, PkgConfig, PL/pgSQL, plaintext, Plutus Core, Pony, PostgreSQL SQL dialect, PostScript, POVRay, PowerQuery, PowerShell, Prolog, PromQL, Promela, properties, Protocol Buffer, PRQL, PSL, Puppet, Python, Python 2 |
| Q | QBasic, QML |
| R | R, Racket, Ragel, Raku, react, ReasonML, reg, Rego, reStructuredText, Rexx, RPMSpec, Ruby, Rust |
| S | SAS, Sass, Scala, Scheme, Scilab, SCSS, Sed, Sieve, Smali, Smalltalk, Smarty, Snobol, Solidity, SourcePawn, SPARQL, SQL, SquidConf, Standard ML, stas, Stylus, Svelte, Swift, SYSTEMD, systemverilog |
| T | TableGen, Tal, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData |
| V | V, V shell, Vala, VB.net, verilog, VHDL, VHS, VimL, vue |
| W | WDTE, WebGPU Shading Language, Whiley |
| X | XML, Xorg |
| Y | YAML, YANG |
| Z | Z80 Assembly, Zed, Zig |
| Prefix | Language |
| :----: | -------- |
| A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Agda, AL, Alloy, Angular2, ANTLR, ApacheConf, APL, AppleScript, ArangoDB AQL, Arduino, ArmAsm, AutoHotkey, AutoIt, Awk |
| B | Ballerina, Bash, Bash Session, Batchfile, BibTeX, Bicep, BlitzBasic, BNF, BQN, Brainfuck |
| C | C, C#, C++, Caddyfile, Caddyfile Directives, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Chapel, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython |
| D | D, Dart, Dax, Diff, Django/Jinja, dns, Docker, DTD, Dylan |
| E | EBNF, Elixir, Elm, EmacsLisp, Erlang |
| F | Factor, Fennel, Fish, Forth, Fortran, FortranFixed, FSharp |
| G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, Gherkin, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groff, Groovy |
| H | Handlebars, Hare, Haskell, Haxe, HCL, Hexdump, HLB, HLSL, HolyC, HTML, HTTP, Hy |
| I | Idris, Igor, INI, Io, ISCdhcpd |
| J | J, Java, JavaScript, JSON, Julia, Jungle |
| K | Kotlin |
| L | Lighttpd configuration file, LLVM, Lua |
| M | Makefile, Mako, markdown, Mason, Mathematica, Matlab, mcfunction, Meson, Metal, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL |
| N | NASM, Natural, Newspeak, Nginx configuration file, Nim, Nix |
| O | Objective-C, OCaml, Octave, Odin, OnesEnterprise, OpenEdge ABL, OpenSCAD, Org Mode |
| P | PacmanConf, Perl, PHP, PHTML, Pig, PkgConfig, PL/pgSQL, plaintext, Plutus Core, Pony, PostgreSQL SQL dialect, PostScript, POVRay, PowerQuery, PowerShell, Prolog, PromQL, properties, Protocol Buffer, PRQL, PSL, Puppet, Python, Python 2 |
| Q | QBasic, QML |
| R | R, Racket, Ragel, Raku, react, ReasonML, reg, reStructuredText, Rexx, Ruby, Rust |
| S | SAS, Sass, Scala, Scheme, Scilab, SCSS, Sed, Sieve, Smali, Smalltalk, Smarty, Snobol, Solidity, SourcePawn, SPARQL, SQL, SquidConf, Standard ML, stas, Stylus, Svelte, Swift, SYSTEMD, systemverilog |
| T | TableGen, Tal, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData |
| V | V, V shell, Vala, VB.net, verilog, VHDL, VHS, VimL, vue |
| W | WDTE, WebGPU Shading Language, Whiley |
| X | XML, Xorg |
| Y | YAML, YANG |
| Z | Z80 Assembly, Zed, Zig |

_I will attempt to keep this section up to date, but an authoritative list can be
displayed with `chroma --list`._

<a id="markdown-try-it" name="try-it"></a>

## Try it

Try out various languages and styles on the [Chroma Playground](https://swapoff.org/chroma/playground/).

<a id="markdown-using-the-library" name="using-the-library"></a>

## Using the library

This is version 2 of Chroma, use the import path:

```go
import "github.com/alecthomas/chroma/v2"
```

Chroma, like Pygments, has the concepts of
[lexers](https://github.com/alecthomas/chroma/tree/master/lexers),
[formatters](https://github.com/alecthomas/chroma/tree/master/formatters) and
@@ -92,6 +95,8 @@ In all cases, if a lexer, formatter or style can not be determined, `nil` will
be returned. In this situation you may want to default to the `Fallback`
value in each respective package, which provides sane defaults.

<a id="markdown-quick-start" name="quick-start"></a>

### Quick start

A convenience function exists that can be used to simply format some source
@@ -101,6 +106,8 @@ text, without any effort:
err := quick.Highlight(os.Stdout, someSourceCode, "go", "html", "monokai")
```

<a id="markdown-identifying-the-language" name="identifying-the-language"></a>

### Identifying the language

To highlight code, you'll first have to identify what language the code is
@@ -140,6 +147,8 @@ token types into a single token:
lexer = chroma.Coalesce(lexer)
```

<a id="markdown-formatting-the-output" name="formatting-the-output"></a>

### Formatting the output

Once a language is identified you will need to pick a formatter and a style (theme).
@@ -168,6 +177,8 @@ And finally, format the tokens from the iterator:
err := formatter.Format(w, style, iterator)
```

<a id="markdown-the-html-formatter" name="the-html-formatter"></a>

### The HTML formatter

By default the `html` registered formatter generates standalone HTML with
@@ -192,8 +203,12 @@ formatter := html.New(html.WithClasses(true))
err := formatter.WriteCSS(w, style)
```

<a id="markdown-more-detail" name="more-detail"></a>

## More detail

<a id="markdown-lexers" name="lexers"></a>

### Lexers

See the [Pygments documentation](http://pygments.org/docs/lexerdevelopment/)
@@ -213,6 +228,8 @@ python3 _tools/pygments2chroma_xml.py \
See notes in [pygments-lexers.txt](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt)
for a list of lexers, and notes on some of the issues importing them.

<a id="markdown-formatters" name="formatters"></a>

### Formatters

Chroma supports HTML output, as well as terminal output in 8 colour, 256 colour, and true-colour.
@@ -220,6 +237,8 @@ Chroma supports HTML output, as well as terminal output in 8 colour, 256 colour,
A `noop` formatter is included that outputs the token text only, and a `tokens`
formatter outputs raw tokens. The latter is useful for debugging lexers.

<a id="markdown-styles" name="styles"></a>

### Styles

Chroma styles are defined in XML. The style entries use the
@@ -243,6 +262,8 @@ Also, token types in a style file are hierarchical. For instance, when `CommentS

For a quick overview of the available styles and how they look, check out the [Chroma Style Gallery](https://xyproto.github.io/splash/docs/).

<a id="markdown-command-line-interface" name="command-line-interface"></a>

## Command-line interface

A command-line interface to Chroma is included.
@@ -267,6 +288,10 @@ on under the hood for easy integration with [lesspipe shipping with
Debian and derivatives](https://manpages.debian.org/lesspipe#USER_DEFINED_FILTERS);
for that setup the `chroma` executable can be just symlinked to `~/.lessfilter`.

<a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a>

<a id="markdown-testing-lexers" name="testing-lexers"></a>

## Testing lexers

If you edit some lexers and want to try it, open a shell in `cmd/chromad` and run:
81	vendor/github.com/alecthomas/chroma/v2/formatters/html/html.go	generated	vendored
@@ -5,9 +5,7 @@ import (
	"html"
	"io"
	"sort"
	"strconv"
	"strings"
	"sync"

	"github.com/alecthomas/chroma/v2"
)
@@ -134,7 +132,6 @@ func New(options ...Option) *Formatter {
		baseLineNumber: 1,
		preWrapper:     defaultPreWrapper,
	}
	f.styleCache = newStyleCache(f)
	for _, option := range options {
		option(f)
	}
@@ -191,7 +188,6 @@ var (

// Formatter that generates HTML.
type Formatter struct {
	styleCache *styleCache
	standalone bool
	prefix     string
	Classes    bool // Exported field to detect when classes are being used
@@ -224,7 +220,12 @@ func (f *Formatter) Format(w io.Writer, style *chroma.Style, iterator chroma.Ite
//
// OTOH we need to be super careful about correct escaping...
func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.Token) (err error) { // nolint: gocyclo
	css := f.styleCache.get(style, true)
	css := f.styleToCSS(style)
	if !f.Classes {
		for t, style := range css {
			css[t] = compressStyle(style)
		}
	}
	if f.standalone {
		fmt.Fprint(w, "<html>\n")
		if f.Classes {
@@ -242,7 +243,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
	wrapInTable := f.lineNumbers && f.lineNumbersInTable

	lines := chroma.SplitTokensIntoLines(tokens)
	lineDigits := len(strconv.Itoa(f.baseLineNumber + len(lines) - 1))
	lineDigits := len(fmt.Sprintf("%d", f.baseLineNumber+len(lines)-1))
	highlightIndex := 0

	if wrapInTable {
@@ -250,7 +251,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
		fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.PreWrapper))
		fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable))
		fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
		fmt.Fprintf(w, "%s", f.preWrapper.Start(false, f.styleAttr(css, chroma.PreWrapper)))
		fmt.Fprintf(w, f.preWrapper.Start(false, f.styleAttr(css, chroma.PreWrapper)))
		for index := range lines {
			line := f.baseLineNumber + index
			highlight, next := f.shouldHighlight(highlightIndex, line)
@@ -272,7 +273,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
		fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD, "width:100%"))
	}

	fmt.Fprintf(w, "%s", f.preWrapper.Start(true, f.styleAttr(css, chroma.PreWrapper)))
	fmt.Fprintf(w, f.preWrapper.Start(true, f.styleAttr(css, chroma.PreWrapper)))

	highlightIndex = 0
	for index, tokens := range lines {
@@ -322,7 +323,7 @@ func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.
			fmt.Fprint(w, `</span>`) // End of Line
		}
	}
	fmt.Fprintf(w, "%s", f.preWrapper.End(true))
	fmt.Fprintf(w, f.preWrapper.End(true))

	if wrapInTable {
		fmt.Fprint(w, "</td></tr></table>\n")
@@ -418,7 +419,7 @@ func (f *Formatter) tabWidthStyle() string {

// WriteCSS writes CSS style definitions (without any surrounding HTML).
func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error {
	css := f.styleCache.get(style, false)
	css := f.styleToCSS(style)
	// Special-case background as it is mapped to the outer ".chroma" class.
	if _, err := fmt.Fprintf(w, "/* %s */ .%sbg { %s }\n", chroma.Background, f.prefix, css[chroma.Background]); err != nil {
		return err
@@ -561,63 +562,3 @@ func compressStyle(s string) string {
	}
	return strings.Join(out, ";")
}

const styleCacheLimit = 32

type styleCacheEntry struct {
	style      *chroma.Style
	compressed bool
	cache      map[chroma.TokenType]string
}

type styleCache struct {
	mu sync.Mutex
	// LRU cache of compiled (and possibly compressed) styles. This is a slice
	// because the cache size is small, and a slice is sufficiently fast for
	// small N.
	cache []styleCacheEntry
	f     *Formatter
}

func newStyleCache(f *Formatter) *styleCache {
	return &styleCache{f: f}
}

func (l *styleCache) get(style *chroma.Style, compress bool) map[chroma.TokenType]string {
	l.mu.Lock()
	defer l.mu.Unlock()

	// Look for an existing entry.
	for i := len(l.cache) - 1; i >= 0; i-- {
		entry := l.cache[i]
		if entry.style == style && entry.compressed == compress {
			// Top of the cache, no need to adjust the order.
			if i == len(l.cache)-1 {
				return entry.cache
			}
			// Move this entry to the end of the LRU
			copy(l.cache[i:], l.cache[i+1:])
			l.cache[len(l.cache)-1] = entry
			return entry.cache
		}
	}

	// No entry, create one.
	cached := l.f.styleToCSS(style)
	if !l.f.Classes {
		for t, style := range cached {
			cached[t] = compressStyle(style)
		}
	}
	if compress {
		for t, style := range cached {
			cached[t] = compressStyle(style)
		}
	}
	// Evict the oldest entry.
	if len(l.cache) >= styleCacheLimit {
		l.cache = l.cache[0:copy(l.cache, l.cache[1:])]
	}
	l.cache = append(l.cache, styleCacheEntry{style: style, cache: cached, compressed: compress})
	return cached
}
180	vendor/github.com/alecthomas/chroma/v2/lexers/caddyfile.go	generated	vendored
@@ -4,82 +4,52 @@ import (
	. "github.com/alecthomas/chroma/v2" // nolint
)

// Matcher token stub for docs, or
// Named matcher: @name, or
// Path matcher: /foo, or
// Wildcard path matcher: *
// nolint: gosec
var caddyfileMatcherTokenRegexp = `(\[\<matcher\>\]|@[^\s]+|/[^\s]+|\*)`

// Comment at start of line, or
// Comment preceded by whitespace
var caddyfileCommentRegexp = `(^|\s+)#.*\n`

// caddyfileCommon are the rules common to both of the lexer variants
func caddyfileCommonRules() Rules {
	return Rules{
		"site_block_common": {
			Include("site_body"),
			// Any other directive
			{`[^\s#]+`, Keyword, Push("directive")},
			Include("base"),
		},
		"site_body": {
			// Import keyword
			{`\b(import|invoke)\b( [^\s#]+)`, ByGroups(Keyword, Text), Push("subdirective")},
			{`(import)(\s+)([^\s]+)`, ByGroups(Keyword, Text, NameVariableMagic), nil},
			// Matcher definition
			{`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")},
			// Matcher token stub for docs
			{`\[\<matcher\>\]`, NameDecorator, Push("matcher")},
			// These cannot have matchers but may have things that look like
			// matchers in their arguments, so we just parse as a subdirective.
			{`\b(try_files|tls|log|bind)\b`, Keyword, Push("subdirective")},
			{`try_files`, Keyword, Push("subdirective")},
			// These are special, they can nest more directives
			{`\b(handle_errors|handle_path|handle_response|replace_status|handle|route)\b`, Keyword, Push("nested_directive")},
			// uri directive has special syntax
			{`\b(uri)\b`, Keyword, Push("uri_directive")},
			{`handle_errors|handle|route|handle_path|not`, Keyword, Push("nested_directive")},
			// Any other directive
			{`[^\s#]+`, Keyword, Push("directive")},
			Include("base"),
		},
		"matcher": {
			{`\{`, Punctuation, Push("block")},
			// Not can be one-liner
			{`not`, Keyword, Push("deep_not_matcher")},
			// Heredoc for CEL expression
			Include("heredoc"),
			// Backtick for CEL expression
			{"`", StringBacktick, Push("backticks")},
			// Any other same-line matcher
			{`[^\s#]+`, Keyword, Push("arguments")},
			// Terminators
			{`\s*\n`, Text, Pop(1)},
			{`\n`, Text, Pop(1)},
			{`\}`, Punctuation, Pop(1)},
			Include("base"),
		},
		"block": {
			{`\}`, Punctuation, Pop(2)},
			// Using double quotes doesn't stop at spaces
			{`"`, StringDouble, Push("double_quotes")},
			// Using backticks doesn't stop at spaces
			{"`", StringBacktick, Push("backticks")},
			// Not can be one-liner
			{`not`, Keyword, Push("not_matcher")},
			// Directives & matcher definitions
			Include("site_body"),
			// Any directive
			// Any other subdirective
			{`[^\s#]+`, Keyword, Push("subdirective")},
			Include("base"),
		},
		"nested_block": {
			{`\}`, Punctuation, Pop(2)},
			// Using double quotes doesn't stop at spaces
			{`"`, StringDouble, Push("double_quotes")},
			// Using backticks doesn't stop at spaces
			{"`", StringBacktick, Push("backticks")},
			// Not can be one-liner
			{`not`, Keyword, Push("not_matcher")},
			// Directives & matcher definitions
			Include("site_body"),
			// Any other subdirective
			{`[^\s#]+`, Keyword, Push("directive")},
			// Matcher definition
			{`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")},
			// Something that starts with literally < is probably a docs stub
			{`\<[^#]+\>`, Keyword, Push("nested_directive")},
			// Any other directive
			{`[^\s#]+`, Keyword, Push("nested_directive")},
			Include("base"),
		},
		"not_matcher": {
@@ -96,97 +66,69 @@ func caddyfileCommonRules() Rules {
		},
		"directive": {
			{`\{(?=\s)`, Punctuation, Push("block")},
			{caddyfileMatcherTokenRegexp, NameDecorator, Push("arguments")},
			{caddyfileCommentRegexp, CommentSingle, Pop(1)},
			{`\s*\n`, Text, Pop(1)},
			Include("matcher_token"),
			Include("comments_pop_1"),
			{`\n`, Text, Pop(1)},
			Include("base"),
		},
		"nested_directive": {
			{`\{(?=\s)`, Punctuation, Push("nested_block")},
			{caddyfileMatcherTokenRegexp, NameDecorator, Push("nested_arguments")},
			{caddyfileCommentRegexp, CommentSingle, Pop(1)},
			{`\s*\n`, Text, Pop(1)},
			Include("matcher_token"),
			Include("comments_pop_1"),
			{`\n`, Text, Pop(1)},
			Include("base"),
		},
		"subdirective": {
			{`\{(?=\s)`, Punctuation, Push("block")},
			{caddyfileCommentRegexp, CommentSingle, Pop(1)},
			{`\s*\n`, Text, Pop(1)},
			Include("comments_pop_1"),
			{`\n`, Text, Pop(1)},
			Include("base"),
		},
		"arguments": {
			{`\{(?=\s)`, Punctuation, Push("block")},
			{caddyfileCommentRegexp, CommentSingle, Pop(2)},
			Include("comments_pop_2"),
			{`\\\n`, Text, nil}, // Skip escaped newlines
			{`\s*\n`, Text, Pop(2)},
			Include("base"),
		},
		"nested_arguments": {
			{`\{(?=\s)`, Punctuation, Push("nested_block")},
			{caddyfileCommentRegexp, CommentSingle, Pop(2)},
			{`\\\n`, Text, nil}, // Skip escaped newlines
			{`\s*\n`, Text, Pop(2)},
			{`\n`, Text, Pop(2)},
			Include("base"),
		},
		"deep_subdirective": {
			{`\{(?=\s)`, Punctuation, Push("block")},
			{caddyfileCommentRegexp, CommentSingle, Pop(3)},
			{`\s*\n`, Text, Pop(3)},
|
||||
Include("comments_pop_3"),
|
||||
{`\n`, Text, Pop(3)},
|
||||
Include("base"),
|
||||
},
|
||||
"uri_directive": {
|
||||
{`\{(?=\s)`, Punctuation, Push("block")},
|
||||
{caddyfileMatcherTokenRegexp, NameDecorator, nil},
|
||||
{`(strip_prefix|strip_suffix|replace|path_regexp)`, NameConstant, Push("arguments")},
|
||||
{caddyfileCommentRegexp, CommentSingle, Pop(1)},
|
||||
{`\s*\n`, Text, Pop(1)},
|
||||
Include("base"),
|
||||
"matcher_token": {
|
||||
{`@[^\s]+`, NameDecorator, Push("arguments")}, // Named matcher
|
||||
{`/[^\s]+`, NameDecorator, Push("arguments")}, // Path matcher
|
||||
{`\*`, NameDecorator, Push("arguments")}, // Wildcard path matcher
|
||||
{`\[\<matcher\>\]`, NameDecorator, Push("arguments")}, // Matcher token stub for docs
|
||||
},
|
||||
"double_quotes": {
|
||||
Include("placeholder"),
|
||||
{`\\"`, StringDouble, nil},
|
||||
{`[^"]`, StringDouble, nil},
|
||||
{`"`, StringDouble, Pop(1)},
|
||||
"comments": {
|
||||
{`^#.*\n`, CommentSingle, nil}, // Comment at start of line
|
||||
{`\s+#.*\n`, CommentSingle, nil}, // Comment preceded by whitespace
|
||||
},
|
||||
"backticks": {
|
||||
Include("placeholder"),
|
||||
{"\\\\`", StringBacktick, nil},
|
||||
{"[^`]", StringBacktick, nil},
|
||||
{"`", StringBacktick, Pop(1)},
|
||||
"comments_pop_1": {
|
||||
{`^#.*\n`, CommentSingle, Pop(1)}, // Comment at start of line
|
||||
{`\s+#.*\n`, CommentSingle, Pop(1)}, // Comment preceded by whitespace
|
||||
},
|
||||
"optional": {
|
||||
// Docs syntax for showing optional parts with [ ]
|
||||
{`\[`, Punctuation, Push("optional")},
|
||||
Include("name_constants"),
|
||||
{`\|`, Punctuation, nil},
|
||||
{`[^\[\]\|]+`, String, nil},
|
||||
{`\]`, Punctuation, Pop(1)},
|
||||
"comments_pop_2": {
|
||||
{`^#.*\n`, CommentSingle, Pop(2)}, // Comment at start of line
|
||||
{`\s+#.*\n`, CommentSingle, Pop(2)}, // Comment preceded by whitespace
|
||||
},
|
||||
"heredoc": {
|
||||
{`(<<([a-zA-Z0-9_-]+))(\n(.*|\n)*)(\s*)(\2)`, ByGroups(StringHeredoc, nil, String, String, String, StringHeredoc), nil},
|
||||
},
|
||||
"name_constants": {
|
||||
{`\b(most_recently_modified|largest_size|smallest_size|first_exist|internal|disable_redirects|ignore_loaded_certs|disable_certs|private_ranges|first|last|before|after|on|off)\b(\||(?=\]|\s|$))`, ByGroups(NameConstant, Punctuation), nil},
|
||||
},
|
||||
"placeholder": {
|
||||
// Placeholder with dots, colon for default value, brackets for args[0:]
|
||||
{`\{[\w+.\[\]\:\$-]+\}`, StringEscape, nil},
|
||||
// Handle opening brackets with no matching closing one
|
||||
{`\{[^\}\s]*\b`, String, nil},
|
||||
"comments_pop_3": {
|
||||
{`^#.*\n`, CommentSingle, Pop(3)}, // Comment at start of line
|
||||
{`\s+#.*\n`, CommentSingle, Pop(3)}, // Comment preceded by whitespace
|
||||
},
|
||||
"base": {
|
||||
{caddyfileCommentRegexp, CommentSingle, nil},
|
||||
{`\[\<matcher\>\]`, NameDecorator, nil},
|
||||
Include("name_constants"),
|
||||
Include("heredoc"),
|
||||
{`(https?://)?([a-z0-9.-]+)(:)([0-9]+)([^\s]*)`, ByGroups(Name, Name, Punctuation, NumberInteger, Name), nil},
|
||||
{`\[`, Punctuation, Push("optional")},
|
||||
{"`", StringBacktick, Push("backticks")},
|
||||
{`"`, StringDouble, Push("double_quotes")},
|
||||
Include("placeholder"),
|
||||
{`[a-z-]+/[a-z-+]+`, String, nil},
|
||||
{`[0-9]+([smhdk]|ns|us|µs|ms)?\b`, NumberInteger, nil},
|
||||
{`[^\s\n#\{]+`, String, nil},
|
||||
Include("comments"),
|
||||
{`(on|off|first|last|before|after|internal|strip_prefix|strip_suffix|replace)\b`, NameConstant, nil},
|
||||
{`(https?://)?([a-z0-9.-]+)(:)([0-9]+)`, ByGroups(Name, Name, Punctuation, LiteralNumberInteger), nil},
|
||||
{`[a-z-]+/[a-z-+]+`, LiteralString, nil},
|
||||
{`[0-9]+[km]?\b`, LiteralNumberInteger, nil},
|
||||
{`\{[\w+.\$-]+\}`, LiteralStringEscape, nil}, // Placeholder
|
||||
{`\[(?=[^#{}$]+\])`, Punctuation, nil},
|
||||
{`\]|\|`, Punctuation, nil},
|
||||
{`[^\s#{}$\]]+`, LiteralString, nil},
|
||||
{`/[^\s#]*`, Name, nil},
|
||||
{`\s+`, Text, nil},
|
||||
},
|
||||
@@ -207,29 +149,27 @@ var Caddyfile = Register(MustNewLexer(
|
||||
func caddyfileRules() Rules {
|
||||
return Rules{
|
||||
"root": {
|
||||
{caddyfileCommentRegexp, CommentSingle, nil},
|
||||
Include("comments"),
|
||||
// Global options block
|
||||
{`^\s*(\{)\s*$`, ByGroups(Punctuation), Push("globals")},
|
||||
// Top level import
|
||||
{`(import)(\s+)([^\s]+)`, ByGroups(Keyword, Text, NameVariableMagic), nil},
|
||||
// Snippets
|
||||
{`(&?\([^\s#]+\))(\s*)(\{)`, ByGroups(NameVariableAnonymous, Text, Punctuation), Push("snippet")},
|
||||
{`(\([^\s#]+\))(\s*)(\{)`, ByGroups(NameVariableAnonymous, Text, Punctuation), Push("snippet")},
|
||||
// Site label
|
||||
{`[^#{(\s,]+`, GenericHeading, Push("label")},
|
||||
// Site label with placeholder
|
||||
{`\{[\w+.\[\]\:\$-]+\}`, StringEscape, Push("label")},
|
||||
{`\{[\w+.\$-]+\}`, LiteralStringEscape, Push("label")},
|
||||
{`\s+`, Text, nil},
|
||||
},
|
||||
"globals": {
|
||||
{`\}`, Punctuation, Pop(1)},
|
||||
// Global options are parsed as subdirectives (no matcher)
|
||||
{`[^\s#]+`, Keyword, Push("subdirective")},
|
||||
{`[^\s#]+`, Keyword, Push("directive")},
|
||||
Include("base"),
|
||||
},
|
||||
"snippet": {
|
||||
{`\}`, Punctuation, Pop(1)},
|
||||
Include("site_body"),
|
||||
// Any other directive
|
||||
// Matcher definition
|
||||
{`@[^\s]+(?=\s)`, NameDecorator, Push("matcher")},
|
||||
// Any directive
|
||||
{`[^\s#]+`, Keyword, Push("directive")},
|
||||
Include("base"),
|
||||
},
|
||||
@@ -239,7 +179,7 @@ func caddyfileRules() Rules {
|
||||
{`,\s*\n?`, Text, nil},
|
||||
{` `, Text, nil},
|
||||
// Site label with placeholder
|
||||
Include("placeholder"),
|
||||
{`\{[\w+.\$-]+\}`, LiteralStringEscape, nil},
|
||||
// Site label
|
||||
{`[^#{(\s,]+`, GenericHeading, nil},
|
||||
// Comment after non-block label (hack because comments end in \n)
|
||||
|
||||
4 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/c#.xml generated vendored
@@ -19,10 +19,10 @@
<rule pattern="\\\n">
<token type="Text"/>
</rule>
<rule pattern="///[^\n\r]*">
<rule pattern="///[^\n\r]+">
<token type="CommentSpecial"/>
</rule>
<rule pattern="//[^\n\r]*">
<rule pattern="//[^\n\r]+">
<token type="CommentSingle"/>
</rule>
<rule pattern="/[*].*?[*]/">
2 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/cue.xml generated vendored
@@ -49,7 +49,7 @@
<rule pattern="(true|false|null|_)\b">
<token type="KeywordConstant"/>
</rule>
<rule pattern="[@#]?[_a-zA-Z$]\w*">
<rule pattern="[_a-zA-Z]\w*">
<token type="Name"/>
</rule>
</state>
17 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/desktop_entry.xml generated vendored
@@ -1,17 +0,0 @@
<lexer>
<config>
<name>Desktop file</name>
<alias>desktop</alias>
<alias>desktop_entry</alias>
<filename>*.desktop</filename>
<mime_type>application/x-desktop</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="^[ \t]*\n"><token type="TextWhitespace"/></rule>
<rule pattern="^(#.*)(\n)"><bygroups><token type="CommentSingle"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="(\[[^\]\n]+\])(\n)"><bygroups><token type="Keyword"/><token type="TextWhitespace"/></bygroups></rule>
<rule pattern="([-A-Za-z0-9]+)(\[[^\] \t=]+\])?([ \t]*)(=)([ \t]*)([^\n]*)([ \t\n]*\n)"><bygroups><token type="NameAttribute"/><token type="NameNamespace"/><token type="TextWhitespace"/><token type="Operator"/><token type="TextWhitespace"/><token type="LiteralString"/><token type="TextWhitespace"/></bygroups></rule>
</state>
</rules>
</lexer>
117 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/gleam.xml generated vendored
@@ -1,117 +0,0 @@
<lexer>
<config>
<name>Gleam</name>
<alias>gleam></alias>
<filename>*.gleam</filename>
<mime_type>text/x-gleam</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="\s+">
<token type="TextWhitespace"/>
</rule>
<rule pattern="///(.*?)\n">
<token type="LiteralStringDoc"/>
</rule>
<rule pattern="//(.*?)\n">
<token type="CommentSingle"/>
</rule>
<rule pattern="(as|assert|case|opaque|panic|pub|todo)\b">
<token type="Keyword"/>
</rule>
<rule pattern="(import|use)\b">
<token type="KeywordNamespace"/>
</rule>
<rule pattern="(auto|const|delegate|derive|echo|else|if|implement|macro|test)\b">
<token type="KeywordReserved"/>
</rule>
<rule pattern="(let)\b">
<token type="KeywordDeclaration"/>
</rule>
<rule pattern="(fn)\b">
<token type="Keyword"/>
</rule>
<rule pattern="(type)\b">
<token type="Keyword"/>
<push state="typename"/>
</rule>
<rule pattern="(True|False)\b">
<token type="KeywordConstant"/>
</rule>
<rule pattern="0[bB][01](_?[01])*">
<token type="LiteralNumberBin"/>
</rule>
<rule pattern="0[oO][0-7](_?[0-7])*">
<token type="LiteralNumberOct"/>
</rule>
<rule pattern="0[xX][\da-fA-F](_?[\dA-Fa-f])*">
<token type="LiteralNumberHex"/>
</rule>
<rule pattern="\d(_?\d)*\.\d(_?\d)*([eE][-+]?\d(_?\d)*)?">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="\d(_?\d)*">
<token type="LiteralNumberInteger"/>
</rule>
<rule pattern=""">
<token type="LiteralString"/>
<push state="string"/>
</rule>
<rule pattern="@([a-z_]\w*[!?]?)">
<token type="NameAttribute"/>
</rule>
<rule pattern="[{}()\[\],]|[#(]|\.\.|<>|<<|>>">
<token type="Punctuation"/>
</rule>
<rule pattern="[+\-*/%!=<>&|.]|<-">
<token type="Operator"/>
</rule>
<rule pattern=":|->">
<token type="Operator"/>
<push state="typename"/>
</rule>
<rule pattern="([a-z_][A-Za-z0-9_]*)(\()">
<bygroups>
<token type="NameFunction"/>
<token type="Punctuation"/>
</bygroups>
</rule>
<rule pattern="([A-Z][A-Za-z0-9_]*)(\()">
<bygroups>
<token type="NameClass"/>
<token type="Punctuation"/>
</bygroups>
</rule>
<rule pattern="([a-z_]\w*[!?]?)">
<token type="Name"/>
</rule>
</state>
<state name="typename">
<rule pattern="\s+">
<token type="TextWhitespace"/>
</rule>
<rule pattern="[A-Z][A-Za-z0-9_]*">
<token type="NameClass"/>
<pop depth="1"/>
</rule>
<rule>
<pop depth="1"/>
</rule>
</state>
<state name="string">
<rule pattern=""">
<token type="LiteralString"/>
<pop depth="1"/>
</rule>
<rule pattern="\\["\\fnrt]|\\u\{[\da-fA-F]{1,6}\}">
<token type="LiteralStringEscape"/>
</rule>
<rule pattern="[^\\"]+">
<token type="LiteralString"/>
</rule>
<rule pattern="\\">
<token type="LiteralString"/>
</rule>
</state>
</rules>
</lexer>
2 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/haskell.xml generated vendored
@@ -86,7 +86,7 @@
<rule pattern="\\(?![:!#$%&*+.\\/<=>?@^|~-]+)">
<token type="NameFunction"/>
</rule>
<rule pattern="(<-|::|->|=>|=|'([:!#$%&*+.\\/<=>?@^|~-]+))(?![:!#$%&*+.\\/<=>?@^|~-]+)">
<rule pattern="(<-|::|->|=>|=)(?![:!#$%&*+.\\/<=>?@^|~-]+)">
<token type="OperatorWord"/>
</rule>
<rule pattern=":[:!#$%&*+.\\/<=>?@^|~-]*">
1 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/json.xml generated vendored
@@ -3,7 +3,6 @@
<name>JSON</name>
<alias>json</alias>
<filename>*.json</filename>
<filename>*.avsc</filename>
<mime_type>application/json</mime_type>
<dot_all>true</dot_all>
<not_multiline>true</not_multiline>
155 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/materialize_sql_dialect.xml generated vendored
@@ -1,155 +0,0 @@
<lexer>
<config>
<name>Materialize SQL dialect</name>
<alias>materialize</alias>
<alias>mzsql</alias>
<mime_type>text/x-materializesql</mime_type>
<case_insensitive>true</case_insensitive>
<not_multiline>true</not_multiline>
</config>
<rules>
<state name="root">
<rule pattern="\s+">
<token type="Text"/>
</rule>
<rule pattern="--.*\n?">
<token type="CommentSingle"/>
</rule>
<rule pattern="/\*">
<token type="CommentMultiline"/>
<push state="multiline-comments"/>
</rule>
<rule pattern="(bigint|bigserial|bit|bit\s+varying|bool|boolean|box|bytea|char|character|character\s+varying|cidr|circle|date|decimal|double\s+precision|float4|float8|inet|int|int2|int4|int8|integer|interval|json|jsonb|line|lseg|macaddr|money|numeric|path|pg_lsn|point|polygon|real|serial|serial2|serial4|serial8|smallint|smallserial|text|time|timestamp|timestamptz|timetz|tsquery|tsvector|txid_snapshot|uuid|varbit|varchar|with\s+time\s+zone|without\s+time\s+zone|xml|anyarray|anyelement|anyenum|anynonarray|anyrange|cstring|fdw_handler|internal|language_handler|opaque|record|void)\b">
<token type="NameBuiltin"/>
</rule>
<rule pattern="(?s)(DO)(\s+)(?:(LANGUAGE)?(\s+)('?)(\w+)?('?)(\s+))?(\$)([^$]*)(\$)(.*?)(\$)(\10)(\$)">
<usingbygroup>
<sublexer_name_group>6</sublexer_name_group>
<code_group>12</code_group>
<emitters>
<token type="Keyword"/>
<token type="Text"/>
<token type="Keyword"/>
<token type="Text"/>
<token type="LiteralStringSingle"/>
<token type="LiteralStringSingle"/>
<token type="LiteralStringSingle"/>
<token type="Text"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
</emitters>
</usingbygroup>
</rule>
<rule pattern="(ACCESS|ACKS|ADD|ADDRESSES|AGGREGATE|ALL|ALTER|AND|ANY|ARN|ARRANGEMENT|ARRAY|AS|ASC|ASSERT|AT|AUCTION|AUTHORITY|AVAILABILITY|AVRO|AWS|BEGIN|BETWEEN|BIGINT|BILLED|BODY|BOOLEAN|BOTH|BPCHAR|BROKEN|BROKER|BROKERS|BY|BYTES|CARDINALITY|CASCADE|CASE|CAST|CERTIFICATE|CHAIN|CHAR|CHARACTER|CHARACTERISTICS|CHECK|CLIENT|CLOSE|CLUSTER|CLUSTERS|COALESCE|COLLATE|COLUMN|COLUMNS|COMMENT|COMMIT|COMMITTED|COMPACTION|COMPRESSION|COMPUTE|COMPUTECTL|CONFLUENT|CONNECTION|CONNECTIONS|CONSTRAINT|COPY|COUNT|COUNTER|CREATE|CREATECLUSTER|CREATEDB|CREATEROLE|CROSS|CSV|CURRENT|CURSOR|DATABASE|DATABASES|DATUMS|DAY|DAYS|DEALLOCATE|DEBEZIUM|DEBUG|DEBUGGING|DEC|DECIMAL|DECLARE|DECORRELATED|DEFAULT|DEFAULTS|DELETE|DELIMITED|DELIMITER|DESC|DETAILS|DISCARD|DISK|DISTINCT|DOC|DOT|DOUBLE|DROP|EFFORT|ELEMENT|ELSE|ENABLE|END|ENDPOINT|ENFORCED|ENVELOPE|ERROR|ESCAPE|EXCEPT|EXECUTE|EXISTS|EXPECTED|EXPLAIN|EXPOSE|EXTRACT|FACTOR|FALSE|FETCH|FIELDS|FILTER|FIRST|FLOAT|FOLLOWING|FOR|FOREIGN|FORMAT|FORWARD|FROM|FULL|FULLNAME|FUNCTION|GENERATOR|GRANT|GREATEST|GROUP|GROUPS|HAVING|HEADER|HEADERS|HOLD|HOST|HOUR|HOURS|ID|IDEMPOTENCE|IDLE|IF|IGNORE|ILIKE|IN|INCLUDE|INDEX|INDEXES|INFO|INHERIT|INLINE|INNER|INPUT|INSERT|INSPECT|INT|INTEGER|INTERNAL|INTERSECT|INTERVAL|INTO|INTROSPECTION|IS|ISNULL|ISOLATION|JOIN|JSON|KAFKA|KEY|KEYS|LAST|LATERAL|LATEST|LEADING|LEAST|LEFT|LEVEL|LIKE|LIMIT|LIST|LOAD|LOCAL|LOG|LOGICAL|LOGIN|MANAGED|MAP|MARKETING|MATERIALIZE|MATERIALIZED|MAX|MECHANISMS|MEMBERSHIP|MERGE|MESSAGE|METADATA|MINUTE|MINUTES|MODE|MONTH|MONTHS|MS|MUTUALLY|NAME|NAMES|NATURAL|NEXT|NO|NOCREATECLUSTER|NOCREATEDB|NOCREATEROLE|NOINHERIT|NOLOGIN|NONE|NOSUPERUSER|NOT|NOTICE|NULL|NULLIF|NULLS|OBJECTS|OF|OFFSET|ON|ONLY|OPERATOR|OPTIMIZED|OPTIMIZER|OPTIONS|OR|ORDER|ORDINALITY|OUTER|OVER|OWNED|OWNER|PARTITION|PASSWORD|PHYSICAL|PLAN|PLANS|PORT|POSITION|POSTGRES|PRECEDING|PRECISION|PREFIX|PREPARE|PRIMARY|PRIVATELINK|PRIVILEGES|PROGRESS|PROTOBUF|PROTOCOL|PUBLICATION|QUERY|QUOTE|RAISE|RANGE|RAW|READ|REAL|REASSIGN|RECURSION|RECURSIVE|REFERENCES|REFRESH|REGEX|REGION|REGISTRY|RENAME|REPEATABLE|REPLACE|REPLICA|REPLICAS|REPLICATION|RESET|RESPECT|RESTRICT|RETENTION|RETURN|RETURNING|REVOKE|RIGHT|ROLE|ROLES|ROLLBACK|ROTATE|ROW|ROWS|SASL|SCALE|SCHEMA|SCHEMAS|SCRIPT|SECOND|SECONDS|SECRET|SECRETS|SECURITY|SEED|SELECT|SEQUENCES|SERIALIZABLE|SERVICE|SESSION|SET|SHARD|SHOW|SINK|SINKS|SIZE|SMALLINT|SNAPSHOT|SOME|SOURCE|SOURCES|SSH|SSL|START|STDIN|STDOUT|STORAGE|STORAGECTL|STRATEGY|STRICT|STRING|SUBSCRIBE|SUBSOURCE|SUBSOURCES|SUBSTRING|SUPERUSER|SWAP|SYSTEM|TABLE|TABLES|TAIL|TEMP|TEMPORARY|TEST|TEXT|THEN|TICK|TIES|TIME|TIMELINE|TIMEOUT|TIMESTAMP|TIMESTAMPTZ|TO|TOKEN|TOPIC|TPCH|TRACE|TRAILING|TRANSACTION|TRIM|TRUE|TUNNEL|TYPE|TYPES|UNBOUNDED|UNCOMMITTED|UNION|UNIQUE|UNKNOWN|UP|UPDATE|UPSERT|URL|USAGE|USER|USERNAME|USERS|USING|VALIDATE|VALUE|VALUES|VARCHAR|VARYING|VIEW|VIEWS|WARNING|WEBHOOK|WHEN|WHERE|WINDOW|WIRE|WITH|WITHIN|WITHOUT|WORK|WORKERS|WRITE|YEAR|YEARS|ZONE|ZONES)\b">
<token type="Keyword"/>
</rule>
<rule pattern="[+*/<>=~!@#%^&|`?-]+">
<token type="Operator"/>
</rule>
<rule pattern="::">
<token type="Operator"/>
</rule>
<rule pattern="\$\d+">
<token type="NameVariable"/>
</rule>
<rule pattern="([0-9]*\.[0-9]*|[0-9]+)(e[+-]?[0-9]+)?">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="[0-9]+">
<token type="LiteralNumberInteger"/>
</rule>
<rule pattern="((?:E|U&)?)(')">
<bygroups>
<token type="LiteralStringAffix"/>
<token type="LiteralStringSingle"/>
</bygroups>
<push state="string"/>
</rule>
<rule pattern="((?:U&)?)(")">
<bygroups>
<token type="LiteralStringAffix"/>
<token type="LiteralStringName"/>
</bygroups>
<push state="quoted-ident"/>
</rule>
<rule pattern="(?s)(\$)([^$]*)(\$)(.*?)(\$)(\2)(\$)(\s+)(LANGUAGE)?(\s+)('?)(\w+)?('?)">
<usingbygroup>
<sublexer_name_group>12</sublexer_name_group>
<code_group>4</code_group>
<emitters>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="LiteralStringHeredoc"/>
<token type="Text"/>
<token type="Keyword"/>
<token type="Text"/>
<token type="LiteralStringSingle"/>
<token type="LiteralStringSingle"/>
<token type="LiteralStringSingle"/>
</emitters>
</usingbygroup>
</rule>
<rule pattern="(?s)(\$)([^$]*)(\$)(.*?)(\$)(\2)(\$)">
<token type="LiteralStringHeredoc"/>
</rule>
<rule pattern="[a-z_]\w*">
<token type="Name"/>
</rule>
<rule pattern=":(['"]?)[a-z]\w*\b\1">
<token type="NameVariable"/>
</rule>
<rule pattern="[;:()\[\]{},.]">
<token type="Punctuation"/>
</rule>
</state>
<state name="multiline-comments">
<rule pattern="/\*">
<token type="CommentMultiline"/>
<push state="multiline-comments"/>
</rule>
<rule pattern="\*/">
<token type="CommentMultiline"/>
<pop depth="1"/>
</rule>
<rule pattern="[^/*]+">
<token type="CommentMultiline"/>
</rule>
<rule pattern="[/*]">
<token type="CommentMultiline"/>
</rule>
</state>
<state name="string">
<rule pattern="[^']+">
<token type="LiteralStringSingle"/>
</rule>
<rule pattern="''">
<token type="LiteralStringSingle"/>
</rule>
<rule pattern="'">
<token type="LiteralStringSingle"/>
<pop depth="1"/>
</rule>
</state>
<state name="quoted-ident">
<rule pattern="[^"]+">
<token type="LiteralStringName"/>
</rule>
<rule pattern="""">
<token type="LiteralStringName"/>
</rule>
<rule pattern=""">
<token type="LiteralStringName"/>
<pop depth="1"/>
</rule>
</state>
</rules>
</lexer>
123 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/ndisasm.xml generated vendored
@@ -1,123 +0,0 @@
<lexer>
<config>
<name>NDISASM</name>
<alias>ndisasm</alias>
<mime_type>text/x-disasm</mime_type>
<case_insensitive>true</case_insensitive>
<priority>0.5</priority> <!-- Lower than NASM -->
</config>
<rules>
<state name="root">
<rule pattern="^[0-9A-Za-z]+">
<token type="CommentSpecial"/>
<push state="offset"/>
</rule>
</state>
<state name="offset">
<rule pattern="[0-9A-Za-z]+">
<token type="CommentSpecial"/>
<push state="assembly"/>
</rule>
<rule>
<include state="whitespace"/>
</rule>
</state>
<state name="punctuation">
<rule pattern="[,():\[\]]+">
<token type="Punctuation"/>
</rule>
<rule pattern="[&|^<>+*/%~-]+">
<token type="Operator"/>
</rule>
<rule pattern="[$]+">
<token type="KeywordConstant"/>
</rule>
<rule pattern="seg|wrt|strict">
<token type="OperatorWord"/>
</rule>
<rule pattern="byte|[dq]?word">
<token type="KeywordType"/>
</rule>
</state>
<state name="assembly">
<rule>
<include state="whitespace"/>
</rule>
<rule pattern="[a-z$._?][\w$.?#@~]*:">
<token type="NameLabel"/>
</rule>
<rule pattern="([a-z$._?][\w$.?#@~]*)(\s+)(equ)">
<bygroups>
<token type="NameConstant"/>
<token type="KeywordDeclaration"/>
<token type="KeywordDeclaration"/>
</bygroups>
<push state="instruction-args"/>
</rule>
<rule pattern="BITS|USE16|USE32|SECTION|SEGMENT|ABSOLUTE|EXTERN|GLOBAL|ORG|ALIGN|STRUC|ENDSTRUC|COMMON|CPU|GROUP|UPPERCASE|IMPORT|EXPORT|LIBRARY|MODULE">
<token type="Keyword"/>
<push state="instruction-args"/>
</rule>
<rule pattern="(?:res|d)[bwdqt]|times">
<token type="KeywordDeclaration"/>
<push state="instruction-args"/>
</rule>
<rule pattern="[a-z$._?][\w$.?#@~]*">
<token type="NameFunction"/>
<push state="instruction-args"/>
</rule>
<rule pattern="[\r\n]+">
<token type="Text"/>
<pop depth="2"/>
</rule>
</state>
<state name="instruction-args">
<rule pattern=""(\\"|[^"\n])*"|'(\\'|[^'\n])*'|`(\\`|[^`\n])*`">
<token type="LiteralString"/>
</rule>
<rule pattern="(?:0x[0-9a-f]+|$0[0-9a-f]*|[0-9]+[0-9a-f]*h)">
<token type="LiteralNumberHex"/>
</rule>
<rule pattern="[0-7]+q">
<token type="LiteralNumberOct"/>
</rule>
<rule pattern="[01]+b">
<token type="LiteralNumberBin"/>
</rule>
<rule pattern="[0-9]+\.e?[0-9]+">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="[0-9]+">
<token type="LiteralNumberInteger"/>
</rule>
<rule>
<include state="punctuation"/>
</rule>
<rule pattern="r[0-9][0-5]?[bwd]|[a-d][lh]|[er]?[a-d]x|[er]?[sb]p|[er]?[sd]i|[c-gs]s|st[0-7]|mm[0-7]|cr[0-4]|dr[0-367]|tr[3-7]">
<token type="NameBuiltin"/>
</rule>
<rule pattern="[a-z$._?][\w$.?#@~]*">
<token type="NameVariable"/>
</rule>
<rule pattern="[\r\n]+">
<token type="Text"/>
<pop depth="3"/>
</rule>
<rule>
<include state="whitespace"/>
</rule>
</state>
<state name="whitespace">
<rule pattern="\n">
<token type="Text"/>
<pop depth="2"/>
</rule>
<rule pattern="[ \t]+">
<token type="Text"/>
</rule>
<rule pattern=";.*">
<token type="CommentSingle"/>
</rule>
</state>
</rules>
</lexer>
12 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/org_mode.xml generated vendored
@@ -228,42 +228,42 @@
</rule>
</state>
<state name="inline">
<rule pattern="(\s*)(\*[^ \n*][^*]+?[^ \n*]\*)((?=\W|\n|$))">
<rule pattern="(\s)*(\*[^ \n*][^*]+?[^ \n*]\*)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="GenericStrong"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="(\s*)(/[^/]+?/)((?=\W|\n|$))">
<rule pattern="(\s)*(/[^/]+?/)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="GenericEmph"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="(\s*)(=[^\n=]+?=)((?=\W|\n|$))">
<rule pattern="(\s)*(=[^\n=]+?=)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="NameClass"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="(\s*)(~[^\n~]+?~)((?=\W|\n|$))">
<rule pattern="(\s)*(~[^\n~]+?~)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="NameClass"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="(\s*)(\+[^+]+?\+)((?=\W|\n|$))">
<rule pattern="(\s)*(\+[^+]+?\+)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="GenericDeleted"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="(\s*)(_[^_]+?_)((?=\W|\n|$))">
<rule pattern="(\s)*(_[^_]+?_)((?=\W|\n|$))">
<bygroups>
<token type="Text"/>
<token type="GenericUnderline"/>
119 vendor/github.com/alecthomas/chroma/v2/lexers/embedded/promela.xml generated vendored
@@ -1,119 +0,0 @@
<lexer>
<config>
<name>Promela</name>
<alias>promela</alias>
<filename>*.pml</filename>
<filename>*.prom</filename>
<filename>*.prm</filename>
<filename>*.promela</filename>
<filename>*.pr</filename>
<filename>*.pm</filename>
<mime_type>text/x-promela</mime_type>
</config>
<rules>
<state name="statements">
<rule pattern="(\[\]|<>|/\\|\\/)|(U|W|V)\b"><token type="Operator"/></rule>
<rule pattern="@"><token type="Punctuation"/></rule>
<rule pattern="(\.)([a-zA-Z_]\w*)"><bygroups><token type="Operator"/><token type="NameAttribute"/></bygroups></rule>
<rule><include state="keywords"/></rule>
<rule><include state="types"/></rule>
<rule pattern="([LuU]|u8)?(")"><bygroups><token type="LiteralStringAffix"/><token type="LiteralString"/></bygroups><push state="string"/></rule>
<rule pattern="([LuU]|u8)?(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')"><bygroups><token type="LiteralStringAffix"/><token type="LiteralStringChar"/><token type="LiteralStringChar"/><token type="LiteralStringChar"/></bygroups></rule>
<rule pattern="0[xX]([0-9a-fA-F](\'?[0-9a-fA-F])*\.[0-9a-fA-F](\'?[0-9a-fA-F])*|\.[0-9a-fA-F](\'?[0-9a-fA-F])*|[0-9a-fA-F](\'?[0-9a-fA-F])*)[pP][+-]?[0-9a-fA-F](\'?[0-9a-fA-F])*[lL]?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="(-)?(\d(\'?\d)*\.\d(\'?\d)*|\.\d(\'?\d)*|\d(\'?\d)*)[eE][+-]?\d(\'?\d)*[fFlL]?"><token type="LiteralNumberFloat"/></rule>
<rule pattern="(-)?((\d(\'?\d)*\.(\d(\'?\d)*)?|\.\d(\'?\d)*)[fFlL]?)|(\d(\'?\d)*[fFlL])"><token type="LiteralNumberFloat"/></rule>
<rule pattern="(-)?0[xX][0-9a-fA-F](\'?[0-9a-fA-F])*(([uU][lL]{0,2})|[lL]{1,2}[uU]?)?"><token type="LiteralNumberHex"/></rule>
<rule pattern="(-)?0[bB][01](\'?[01])*(([uU][lL]{0,2})|[lL]{1,2}[uU]?)?"><token type="LiteralNumberBin"/></rule>
<rule pattern="(-)?0(\'?[0-7])+(([uU][lL]{0,2})|[lL]{1,2}[uU]?)?"><token type="LiteralNumberOct"/></rule>
<rule pattern="(-)?\d(\'?\d)*(([uU][lL]{0,2})|[lL]{1,2}[uU]?)?"><token type="LiteralNumberInteger"/></rule>
<rule pattern="[~!%^&*+=|?:<>/-]"><token type="Operator"/></rule>
<rule pattern="[()\[\],.]"><token type="Punctuation"/></rule>
<rule pattern="(true|false|NULL)\b"><token type="NameBuiltin"/></rule>
<rule pattern="(?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8})+"><token type="Name"/></rule>
</state>
<state name="types">
<rule pattern="(bit|bool|byte|pid|short|int|unsigned)\b"><token type="KeywordType"/></rule>
</state>
<state name="keywords">
<rule pattern="(atomic|break|d_step|do|od|for|in|goto|if|fi|unless)\b"><token type="Keyword"/></rule>
<rule pattern="(assert|get_priority|printf|printm|set_priority)\b"><token type="NameFunction"/></rule>
<rule pattern="(c_code|c_decl|c_expr|c_state|c_track)\b"><token type="Keyword"/></rule>
<rule pattern="(_|_last|_nr_pr|_pid|_priority|else|np_|STDIN)\b"><token type="NameBuiltin"/></rule>
<rule pattern="(empty|enabled|eval|full|len|nempty|nfull|pc_value)\b"><token type="NameFunction"/></rule>
<rule pattern="run\b"><token type="OperatorWord"/></rule>
<rule pattern="(active|chan|D_proctype|hidden|init|local|mtype|never|notrace|proctype|show|trace|typedef|xr|xs)\b"><token type="KeywordDeclaration"/></rule>
<rule pattern="(priority|provided)\b"><token type="Keyword"/></rule>
<rule pattern="(inline|ltl|select)\b"><token type="KeywordDeclaration"/></rule>
<rule pattern="skip\b"><token type="Keyword"/></rule>
</state>
<state name="whitespace">
<rule pattern="^#if\s+0"><token type="CommentPreproc"/><push state="if0"/></rule>
<rule pattern="^#"><token type="CommentPreproc"/><push state="macro"/></rule>
<rule pattern="^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)"><bygroups><usingself state="root"/><token type="CommentPreproc"/></bygroups><push state="if0"/></rule>
<rule pattern="^(\s*(?:/[*].*?[*]/\s*)?)(#)"><bygroups><usingself state="root"/><token type="CommentPreproc"/></bygroups><push state="macro"/></rule>
<rule pattern="(^[ \t]*)(?!(?:public|private|protected|default)\b)((?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8})+)(\s*)(:)(?!:)"><bygroups><token type="TextWhitespace"/><token type="NameLabel"/><token type="TextWhitespace"/><token type="Punctuation"/></bygroups></rule>
<rule pattern="\n"><token type="TextWhitespace"/></rule>
<rule pattern="[^\S\n]+"><token type="TextWhitespace"/></rule>
<rule pattern="\\\n"><token type="Text"/></rule>
<rule pattern="//(?:.|(?<=\\)\n)*\n"><token type="CommentSingle"/></rule>
<rule pattern="/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/"><token type="CommentMultiline"/></rule>
|
||||
<rule pattern="/(\\\n)?[*][\w\W]*"><token type="CommentMultiline"/></rule>
|
||||
</state>
|
||||
<state name="root">
|
||||
<rule><include state="whitespace"/></rule>
|
||||
<rule><include state="keywords"/></rule>
|
||||
<rule pattern="((?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|::)+(?:[&*\s])+)(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)((?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|::)+)(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)(\([^;"\')]*?\))(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)([^;{/"\']*)(\{)"><bygroups><usingself state="root"/><usingself state="whitespace"/><token type="NameFunction"/><usingself state="whitespace"/><usingself state="root"/><usingself state="whitespace"/><usingself state="root"/><token type="Punctuation"/></bygroups><push state="function"/></rule>
|
||||
<rule pattern="((?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|::)+(?:[&*\s])+)(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)((?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}|::)+)(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)(\([^;"\')]*?\))(\s*(?:(?:(?://(?:.|(?<=\\)\n)*\n)|(?:/(?:\\\n)?[*](?:[^*]|[*](?!(?:\\\n)?/))*[*](?:\\\n)?/))\s*)*)([^;/"\']*)(;)"><bygroups><usingself state="root"/><usingself state="whitespace"/><token type="NameFunction"/><usingself state="whitespace"/><usingself state="root"/><usingself state="whitespace"/><usingself state="root"/><token type="Punctuation"/></bygroups></rule>
|
||||
<rule><include state="types"/></rule>
|
||||
<rule><push state="statement"/></rule>
|
||||
</state>
|
||||
<state name="statement">
|
||||
<rule><include state="whitespace"/></rule>
|
||||
<rule><include state="statements"/></rule>
|
||||
<rule pattern="\}"><token type="Punctuation"/></rule>
|
||||
<rule pattern="[{;]"><token type="Punctuation"/><pop depth="1"/></rule>
|
||||
</state>
|
||||
<state name="function">
|
||||
<rule><include state="whitespace"/></rule>
|
||||
<rule><include state="statements"/></rule>
|
||||
<rule pattern=";"><token type="Punctuation"/></rule>
|
||||
<rule pattern="\{"><token type="Punctuation"/><push/></rule>
|
||||
<rule pattern="\}"><token type="Punctuation"/><pop depth="1"/></rule>
|
||||
</state>
|
||||
<state name="string">
|
||||
<rule pattern="""><token type="LiteralString"/><pop depth="1"/></rule>
|
||||
<rule pattern="\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})"><token type="LiteralStringEscape"/></rule>
|
||||
<rule pattern="[^\\"\n]+"><token type="LiteralString"/></rule>
|
||||
<rule pattern="\\\n"><token type="LiteralString"/></rule>
|
||||
<rule pattern="\\"><token type="LiteralString"/></rule>
|
||||
</state>
|
||||
<state name="macro">
|
||||
<rule pattern="(\s*(?:/[*].*?[*]/\s*)?)(include)(\s*(?:/[*].*?[*]/\s*)?)("[^"]+")([^\n]*)"><bygroups><usingself state="root"/><token type="CommentPreproc"/><usingself state="root"/><token type="CommentPreprocFile"/><token type="CommentSingle"/></bygroups></rule>
|
||||
<rule pattern="(\s*(?:/[*].*?[*]/\s*)?)(include)(\s*(?:/[*].*?[*]/\s*)?)(<[^>]+>)([^\n]*)"><bygroups><usingself state="root"/><token type="CommentPreproc"/><usingself state="root"/><token type="CommentPreprocFile"/><token type="CommentSingle"/></bygroups></rule>
|
||||
<rule pattern="[^/\n]+"><token type="CommentPreproc"/></rule>
|
||||
<rule pattern="/[*](.|\n)*?[*]/"><token type="CommentMultiline"/></rule>
|
||||
<rule pattern="//.*?\n"><token type="CommentSingle"/><pop depth="1"/></rule>
|
||||
<rule pattern="/"><token type="CommentPreproc"/></rule>
|
||||
<rule pattern="(?<=\\)\n"><token type="CommentPreproc"/></rule>
|
||||
<rule pattern="\n"><token type="CommentPreproc"/><pop depth="1"/></rule>
|
||||
</state>
|
||||
<state name="if0">
|
||||
<rule pattern="^\s*#if.*?(?<!\\)\n"><token type="CommentPreproc"/><push/></rule>
|
||||
<rule pattern="^\s*#el(?:se|if).*\n"><token type="CommentPreproc"/><pop depth="1"/></rule>
|
||||
<rule pattern="^\s*#endif.*?(?<!\\)\n"><token type="CommentPreproc"/><pop depth="1"/></rule>
|
||||
<rule pattern=".*?\n"><token type="Comment"/></rule>
|
||||
</state>
|
||||
<state name="classname">
|
||||
<rule pattern="(?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8})+"><token type="NameClass"/><pop depth="1"/></rule>
|
||||
<rule pattern="\s*(?=>)"><token type="Text"/><pop depth="1"/></rule>
|
||||
<rule><pop depth="1"/></rule>
|
||||
</state>
|
||||
<state name="case-value">
|
||||
<rule pattern="(?<!:)(:)(?!:)"><token type="Punctuation"/><pop depth="1"/></rule>
|
||||
<rule pattern="(?!\d)(?:[\w$]|\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8})+"><token type="NameConstant"/></rule>
|
||||
<rule><include state="whitespace"/></rule>
|
||||
<rule><include state="statements"/></rule>
|
||||
</state>
|
||||
</rules>
|
||||
</lexer>
|
||||
6
vendor/github.com/alecthomas/chroma/v2/lexers/embedded/python.xml
generated
vendored
@@ -19,10 +19,6 @@
<filename>BUILD</filename>
<filename>BUILD.bazel</filename>
<filename>WORKSPACE</filename>
<filename>WORKSPACE.bzlmod</filename>
<filename>WORKSPACE.bazel</filename>
<filename>MODULE.bazel</filename>
<filename>REPO.bazel</filename>
<filename>*.tac</filename>
<mime_type>text/x-python</mime_type>
<mime_type>application/x-python</mime_type>
@@ -590,4 +586,4 @@
</rule>
</state>
</rules>
</lexer>
</lexer>
94
vendor/github.com/alecthomas/chroma/v2/lexers/embedded/rego.xml
generated
vendored
@@ -1,94 +0,0 @@
<lexer>
<config>
<name>Rego</name>
<alias>rego</alias>
<filename>*.rego</filename>
</config>
<rules>
<state name="root">
<rule pattern="(package|import|as|not|with|default|else|some|in|if|contains)\b">
<token type="KeywordDeclaration"/>
</rule>
<!-- importing keywords should then show up as keywords -->
<rule pattern="(import)( future.keywords.)(\w+)">
<bygroups>
<token type="KeywordDeclaration"/>
<token type="Text"/>
<token type="KeywordDeclaration"/>
</bygroups>
</rule>
<rule pattern="#[^\r\n]*">
<token type="Comment"/>
</rule>
<rule pattern="(FIXME|TODO|XXX)\b( .*)$">
<bygroups>
<token type="Error"/>
<token type="CommentSpecial"/>
</bygroups>
</rule>
<rule pattern="(true|false|null)\b">
<token type="KeywordConstant"/>
</rule>
<rule pattern="\d+i">
<token type="LiteralNumber"/>
</rule>
<rule pattern="\d+\.\d*([Ee][-+]\d+)?i">
<token type="LiteralNumber"/>
</rule>
<rule pattern="\.\d+([Ee][-+]\d+)?i">
<token type="LiteralNumber"/>
</rule>
<rule pattern="\d+[Ee][-+]\d+i">
<token type="LiteralNumber"/>
</rule>
<rule pattern="\d+(\.\d+[eE][+\-]?\d+|\.\d*|[eE][+\-]?\d+)">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="\.\d+([eE][+\-]?\d+)?">
<token type="LiteralNumberFloat"/>
</rule>
<rule pattern="(0|[1-9][0-9]*)">
<token type="LiteralNumberInteger"/>
</rule>
<rule pattern="""".*?"""">
<token type="LiteralStringDouble"/>
</rule>
<rule pattern=""(\\\\|\\"|[^"])*"">
<token type="LiteralStringDouble"/>
</rule>
<rule pattern="\$/((?!/\$).)*/\$">
<token type="LiteralString"/>
</rule>
<rule pattern="/(\\\\|\\"|[^/])*/">
<token type="LiteralString"/>
</rule>
<rule pattern="^(\w+)">
<token type="Name"/>
</rule>
<rule pattern="[a-z_-][\w-]*(?=\()">
<token type="NameFunction"/>
</rule>
<rule pattern="[\r\n\s]+">
<token type="TextWhitespace"/>
</rule>
<rule pattern="(package|import)(\s+)">
<bygroups>
<token type="KeywordDeclaration"/>
<token type="Text"/>
</bygroups>
</rule>
<rule pattern="[=<>!+-/*&|]">
<token type="Operator"/>
</rule>
<rule pattern=":=">
<token type="Operator"/>
</rule>
<rule pattern="[[\]{}():;]+">
<token type="Punctuation"/>
</rule>
<rule pattern="[$a-zA-Z_]\w*">
<token type="NameOther"/>
</rule>
</state>
</rules>
</lexer>
58
vendor/github.com/alecthomas/chroma/v2/lexers/embedded/rpm_spec.xml
generated
vendored
@@ -1,58 +0,0 @@
<lexer>
<config>
<name>RPMSpec</name>
<alias>spec</alias>
<filename>*.spec</filename>
<mime_type>text/x-rpm-spec</mime_type>
</config>
<rules>
<state name="root">
<rule pattern="#.*$"><token type="Comment"/></rule>
<rule><include state="basic"/></rule>
</state>
<state name="description">
<rule pattern="^(%(?:package|prep|build|install|clean|check|pre[a-z]*|post[a-z]*|trigger[a-z]*|files))(.*)$"><bygroups><token type="NameDecorator"/><token type="Text"/></bygroups><pop depth="1"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="."><token type="Text"/></rule>
</state>
<state name="changelog">
<rule pattern="\*.*$"><token type="GenericSubheading"/></rule>
<rule pattern="^(%(?:package|prep|build|install|clean|check|pre[a-z]*|post[a-z]*|trigger[a-z]*|files))(.*)$"><bygroups><token type="NameDecorator"/><token type="Text"/></bygroups><pop depth="1"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="."><token type="Text"/></rule>
</state>
<state name="string">
<rule pattern="""><token type="LiteralStringDouble"/><pop depth="1"/></rule>
<rule pattern="\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|[0-7]{1,3})"><token type="LiteralStringEscape"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="."><token type="LiteralStringDouble"/></rule>
</state>
<state name="basic">
<rule><include state="macro"/></rule>
<rule pattern="(?i)^(Name|Version|Release|Epoch|Summary|Group|License|Packager|Vendor|Icon|URL|Distribution|Prefix|Patch[0-9]*|Source[0-9]*|Requires\(?[a-z]*\)?|[a-z]+Req|Obsoletes|Suggests|Provides|Conflicts|Build[a-z]+|[a-z]+Arch|Auto[a-z]+)(:)(.*)$"><bygroups><token type="GenericHeading"/><token type="Punctuation"/><usingself state="root"/></bygroups></rule>
<rule pattern="^%description"><token type="NameDecorator"/><push state="description"/></rule>
<rule pattern="^%changelog"><token type="NameDecorator"/><push state="changelog"/></rule>
<rule pattern="^(%(?:package|prep|build|install|clean|check|pre[a-z]*|post[a-z]*|trigger[a-z]*|files))(.*)$"><bygroups><token type="NameDecorator"/><token type="Text"/></bygroups></rule>
<rule pattern="%(attr|defattr|dir|doc(?:dir)?|setup|config(?:ure)?|make(?:install)|ghost|patch[0-9]+|find_lang|exclude|verify)"><token type="Keyword"/></rule>
<rule><include state="interpol"/></rule>
<rule pattern="'.*?'"><token type="LiteralStringSingle"/></rule>
<rule pattern="""><token type="LiteralStringDouble"/><push state="string"/></rule>
<rule pattern="\s+"><token type="TextWhitespace"/></rule>
<rule pattern="."><token type="Text"/></rule>
</state>
<state name="macro">
<rule pattern="%define.*$"><token type="CommentPreproc"/></rule>
<rule pattern="%\{\!\?.*%define.*\}"><token type="CommentPreproc"/></rule>
<rule pattern="(%(?:if(?:n?arch)?|else(?:if)?|endif))(.*)$"><bygroups><token type="CommentPreproc"/><token type="Text"/></bygroups></rule>
</state>
<state name="interpol">
<rule pattern="%\{?__[a-z_]+\}?"><token type="NameFunction"/></rule>
<rule pattern="%\{?_([a-z_]+dir|[a-z_]+path|prefix)\}?"><token type="KeywordPseudo"/></rule>
<rule pattern="%\{\?\w+\}"><token type="NameVariable"/></rule>
<rule pattern="\$\{?RPM_[A-Z0-9_]+\}?"><token type="NameVariableGlobal"/></rule>
<rule pattern="%\{[a-zA-Z]\w+\}"><token type="KeywordConstant"/></rule>
</state>
</rules>
</lexer>
20
vendor/github.com/alecthomas/chroma/v2/lexers/embedded/typescript.xml
generated
vendored
@@ -51,13 +51,10 @@
</rule>
</state>
<state name="tag">
<rule>
<include state="commentsandwhitespace"/>
</rule>
<rule pattern="\s+">
<token type="Text"/>
</rule>
<rule pattern="([\w-]+\s*)(=)(\s*)">
<rule pattern="([\w]+\s*)(=)(\s*)">
<bygroups>
<token type="NameAttribute"/>
<token type="Operator"/>
@@ -80,25 +77,12 @@
<pop depth="1"/>
</rule>
</state>
<state name="comment">
<rule pattern="[^-]+">
<token type="Comment"/>
</rule>
<rule pattern="-->">
<token type="Comment"/>
<pop depth="1"/>
</rule>
<rule pattern="-">
<token type="Comment"/>
</rule>
</state>
<state name="commentsandwhitespace">
<rule pattern="\s+">
<token type="Text"/>
</rule>
<rule pattern="<!--">
<token type="Comment"/>
<push state="comment"/>
</rule>
<rule pattern="//.*?\n">
<token type="CommentSingle"/>
@@ -216,7 +200,7 @@
<rule pattern="(Array|Boolean|Date|Error|Function|Math|Number|Object|Packages|RegExp|String|decodeURI|decodeURIComponent|encodeURI|encodeURIComponent|eval|isFinite|isNaN|parseFloat|parseInt|document|this|window)\b">
<token type="NameBuiltin"/>
</rule>
<rule pattern="\b(module)(\s+)("[\w\./@]+")(\s+)">
<rule pattern="\b(module)(\s*)(\s*[\w?.$][\w?.$]*)(\s*)">
<bygroups>
<token type="KeywordReserved"/>
<token type="Text"/>
10
vendor/github.com/alecthomas/chroma/v2/lexers/embedded/vue.xml
generated
vendored
@@ -83,10 +83,9 @@
<token type="LiteralString"/>
</bygroups>
</rule>
<rule pattern="(:[\S]+)(=)("[\S]+")">
<rule pattern="(:[\S]+)(="[\S]+")">
<bygroups>
<token type="NameTag"/>
<token type="Operator"/>
<token type="LiteralString"/>
</bygroups>
</rule>
@@ -105,10 +104,9 @@
<token type="Punctuation"/>
</bygroups>
</rule>
<rule pattern="(v-[\w]+)(=)("[\S ]+")(>|\s)">
<rule pattern="(v-[\w]+)(="[\S]+")(>)">
<bygroups>
<token type="NameTag"/>
<token type="Operator"/>
<token type="LiteralString"/>
<token type="Punctuation"/>
</bygroups>
@@ -260,14 +258,14 @@
</rule>
</state>
<state name="vue">
<rule pattern="(<)([\w-]+)">
<rule pattern="(<)([\w]+)">
<bygroups>
<token type="Punctuation"/>
<token type="NameTag"/>
</bygroups>
<push state="tag"/>
</rule>
<rule pattern="(<)(/)([\w-]+)(>)">
<rule pattern="(<)(/)([\w]+)(>)">
<bygroups>
<token type="Punctuation"/>
<token type="Punctuation"/>
2
vendor/github.com/alecthomas/chroma/v2/lexers/go.go
generated
vendored
@@ -55,7 +55,7 @@ func goRules() Rules {
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`(<<=|>>=|<<|>>|<=|>=|&\^=|&\^|\+=|-=|\*=|/=|%=|&=|\|=|&&|\|\||<-|\+\+|--|==|!=|:=|\.\.\.|[+\-*/%&])`, Operator, nil},
{`([a-zA-Z_]\w*)(\s*)(\()`, ByGroups(NameFunction, UsingSelf("root"), Punctuation), nil},
{`[|^<>=!()\[\]{}.,;:~]`, Punctuation, nil},
{`[|^<>=!()\[\]{}.,;:]`, Punctuation, nil},
{`[^\W\d]\w*`, NameOther, nil},
},
}
25
vendor/github.com/alecthomas/chroma/v2/renovate.json5
generated
vendored
@@ -1,18 +1,11 @@
{
$schema: "https://docs.renovatebot.com/renovate-schema.json",
extends: [
"config:recommended",
":semanticCommits",
":semanticCommitTypeAll(chore)",
":semanticCommitScope(deps)",
"group:allNonMajor",
"schedule:earlyMondays", // Run once a week.
],
packageRules: [
{
matchPackageNames: ["golangci-lint"],
matchManagers: ["hermit"],
enabled: false,
},
],
$schema: "https://docs.renovatebot.com/renovate-schema.json",
extends: [
"config:recommended",
":semanticCommits",
":semanticCommitTypeAll(chore)",
":semanticCommitScope(deps)",
"group:allNonMajor",
"schedule:earlyMondays", // Run once a week.
],
}
2
vendor/github.com/alecthomas/chroma/v2/styles/catppuccin-frappe.xml
generated
vendored
@@ -5,7 +5,7 @@
<entry type="Other" style="#c6d0f5"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#51576d"/>
<entry type="LineHighlight" style="#51576d"/>
<entry type="LineNumbersTable" style="#838ba7"/>
<entry type="LineNumbers" style="#838ba7"/>
<entry type="Keyword" style="#ca9ee6"/>
2
vendor/github.com/alecthomas/chroma/v2/styles/catppuccin-latte.xml
generated
vendored
@@ -5,7 +5,7 @@
<entry type="Other" style="#4c4f69"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#bcc0cc"/>
<entry type="LineHighlight" style="#bcc0cc"/>
<entry type="LineNumbersTable" style="#8c8fa1"/>
<entry type="LineNumbers" style="#8c8fa1"/>
<entry type="Keyword" style="#8839ef"/>
2
vendor/github.com/alecthomas/chroma/v2/styles/catppuccin-macchiato.xml
generated
vendored
@@ -5,7 +5,7 @@
<entry type="Other" style="#cad3f5"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#494d64"/>
<entry type="LineHighlight" style="#494d64"/>
<entry type="LineNumbersTable" style="#8087a2"/>
<entry type="LineNumbers" style="#8087a2"/>
<entry type="Keyword" style="#c6a0f6"/>
2
vendor/github.com/alecthomas/chroma/v2/styles/catppuccin-mocha.xml
generated
vendored
@@ -5,7 +5,7 @@
<entry type="Other" style="#cdd6f4"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#45475a"/>
<entry type="LineHighlight" style="#45475a"/>
<entry type="LineNumbersTable" style="#7f849c"/>
<entry type="LineNumbers" style="#7f849c"/>
<entry type="Keyword" style="#cba6f7"/>
4
vendor/github.com/alecthomas/chroma/v2/styles/github-dark.xml
generated
vendored
@@ -1,6 +1,6 @@
<style name="github-dark">
<entry type="Error" style="#f85149"/>
<entry type="LineHighlight" style="bg:#6e7681"/>
<entry type="LineHighlight" style="#6e7681"/>
<entry type="LineNumbers" style="#6e7681"/>
<entry type="Background" style="#e6edf3 bg:#0d1117"/>
<entry type="Keyword" style="#ff7b72"/>
@@ -42,4 +42,4 @@
<entry type="GenericTraceback" style="#ff7b72"/>
<entry type="GenericUnderline" style="underline"/>
<entry type="TextWhitespace" style="#6e7681"/>
</style>
</style>
83
vendor/github.com/alecthomas/chroma/v2/styles/tokyonight-day.xml
generated
vendored
@@ -1,83 +0,0 @@
<style name="tokyonight-day">
<entry type="Background" style="bg:#e1e2e7 #3760bf"/>
<entry type="CodeLine" style="#3760bf"/>
<entry type="Error" style="#c64343"/>
<entry type="Other" style="#3760bf"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#a1a6c5"/>
<entry type="LineNumbersTable" style="#6172b0"/>
<entry type="LineNumbers" style="#6172b0"/>
<entry type="Keyword" style="#9854f1"/>
<entry type="KeywordReserved" style="#9854f1"/>
<entry type="KeywordPseudo" style="#9854f1"/>
<entry type="KeywordConstant" style="#8c6c3e"/>
<entry type="KeywordDeclaration" style="#9d7cd8"/>
<entry type="KeywordNamespace" style="#007197"/>
<entry type="KeywordType" style="#0db9d7"/>
<entry type="Name" style="#3760bf"/>
<entry type="NameClass" style="#b15c00"/>
<entry type="NameConstant" style="#b15c00"/>
<entry type="NameDecorator" style="bold #2e7de9"/>
<entry type="NameEntity" style="#007197"/>
<entry type="NameException" style="#8c6c3e"/>
<entry type="NameFunction" style="#2e7de9"/>
<entry type="NameFunctionMagic" style="#2e7de9"/>
<entry type="NameLabel" style="#587539"/>
<entry type="NameNamespace" style="#8c6c3e"/>
<entry type="NameProperty" style="#8c6c3e"/>
<entry type="NameTag" style="#9854f1"/>
<entry type="NameVariable" style="#3760bf"/>
<entry type="NameVariableClass" style="#3760bf"/>
<entry type="NameVariableGlobal" style="#3760bf"/>
<entry type="NameVariableInstance" style="#3760bf"/>
<entry type="NameVariableMagic" style="#3760bf"/>
<entry type="NameAttribute" style="#2e7de9"/>
<entry type="NameBuiltin" style="#587539"/>
<entry type="NameBuiltinPseudo" style="#587539"/>
<entry type="NameOther" style="#3760bf"/>
<entry type="Literal" style="#3760bf"/>
<entry type="LiteralDate" style="#3760bf"/>
<entry type="LiteralString" style="#587539"/>
<entry type="LiteralStringChar" style="#587539"/>
<entry type="LiteralStringSingle" style="#587539"/>
<entry type="LiteralStringDouble" style="#587539"/>
<entry type="LiteralStringBacktick" style="#587539"/>
<entry type="LiteralStringOther" style="#587539"/>
<entry type="LiteralStringSymbol" style="#587539"/>
<entry type="LiteralStringInterpol" style="#587539"/>
<entry type="LiteralStringAffix" style="#9d7cd8"/>
<entry type="LiteralStringDelimiter" style="#2e7de9"/>
<entry type="LiteralStringEscape" style="#2e7de9"/>
<entry type="LiteralStringRegex" style="#007197"/>
<entry type="LiteralStringDoc" style="#a1a6c5"/>
<entry type="LiteralStringHeredoc" style="#a1a6c5"/>
<entry type="LiteralNumber" style="#8c6c3e"/>
<entry type="LiteralNumberBin" style="#8c6c3e"/>
<entry type="LiteralNumberHex" style="#8c6c3e"/>
<entry type="LiteralNumberInteger" style="#8c6c3e"/>
<entry type="LiteralNumberFloat" style="#8c6c3e"/>
<entry type="LiteralNumberIntegerLong" style="#8c6c3e"/>
<entry type="LiteralNumberOct" style="#8c6c3e"/>
<entry type="Operator" style="bold #587539"/>
<entry type="OperatorWord" style="bold #587539"/>
<entry type="Comment" style="italic #a1a6c5"/>
<entry type="CommentSingle" style="italic #a1a6c5"/>
<entry type="CommentMultiline" style="italic #a1a6c5"/>
<entry type="CommentSpecial" style="italic #a1a6c5"/>
<entry type="CommentHashbang" style="italic #a1a6c5"/>
<entry type="CommentPreproc" style="italic #a1a6c5"/>
<entry type="CommentPreprocFile" style="bold #a1a6c5"/>
<entry type="Generic" style="#3760bf"/>
<entry type="GenericInserted" style="bg:#e9e9ed #587539"/>
<entry type="GenericDeleted" style="#c64343 bg:#e9e9ed"/>
<entry type="GenericEmph" style="italic #3760bf"/>
<entry type="GenericStrong" style="bold #3760bf"/>
<entry type="GenericUnderline" style="underline #3760bf"/>
<entry type="GenericHeading" style="bold #8c6c3e"/>
<entry type="GenericSubheading" style="bold #8c6c3e"/>
<entry type="GenericOutput" style="#3760bf"/>
<entry type="GenericPrompt" style="#3760bf"/>
<entry type="GenericError" style="#c64343"/>
<entry type="GenericTraceback" style="#c64343"/>
</style>
83
vendor/github.com/alecthomas/chroma/v2/styles/tokyonight-moon.xml
generated
vendored
@@ -1,83 +0,0 @@
<style name="tokyonight-moon">
<entry type="Background" style="bg:#222436 #c8d3f5"/>
<entry type="CodeLine" style="#c8d3f5"/>
<entry type="Error" style="#c53b53"/>
<entry type="Other" style="#c8d3f5"/>
<entry type="LineTableTD" style=""/>
<entry type="LineTable" style=""/>
<entry type="LineHighlight" style="bg:#444a73"/>
<entry type="LineNumbersTable" style="#828bb8"/>
<entry type="LineNumbers" style="#828bb8"/>
<entry type="Keyword" style="#c099ff"/>
<entry type="KeywordReserved" style="#c099ff"/>
<entry type="KeywordPseudo" style="#c099ff"/>
<entry type="KeywordConstant" style="#ffc777"/>
<entry type="KeywordDeclaration" style="#c099ff"/>
<entry type="KeywordNamespace" style="#86e1fc"/>
<entry type="KeywordType" style="#4fd6be"/>
<entry type="Name" style="#c8d3f5"/>
<entry type="NameClass" style="#ff966c"/>
<entry type="NameConstant" style="#ff966c"/>
<entry type="NameDecorator" style="bold #82aaff"/>
<entry type="NameEntity" style="#86e1fc"/>
<entry type="NameException" style="#ffc777"/>
<entry type="NameFunction" style="#82aaff"/>
<entry type="NameFunctionMagic" style="#82aaff"/>
<entry type="NameLabel" style="#c3e88d"/>
<entry type="NameNamespace" style="#ffc777"/>
<entry type="NameProperty" style="#ffc777"/>
<entry type="NameTag" style="#c099ff"/>
<entry type="NameVariable" style="#c8d3f5"/>
<entry type="NameVariableClass" style="#c8d3f5"/>
<entry type="NameVariableGlobal" style="#c8d3f5"/>
<entry type="NameVariableInstance" style="#c8d3f5"/>
<entry type="NameVariableMagic" style="#c8d3f5"/>
<entry type="NameAttribute" style="#82aaff"/>
<entry type="NameBuiltin" style="#c3e88d"/>
<entry type="NameBuiltinPseudo" style="#c3e88d"/>
<entry type="NameOther" style="#c8d3f5"/>
<entry type="Literal" style="#c8d3f5"/>
<entry type="LiteralDate" style="#c8d3f5"/>
<entry type="LiteralString" style="#c3e88d"/>
<entry type="LiteralStringChar" style="#c3e88d"/>
<entry type="LiteralStringSingle" style="#c3e88d"/>
<entry type="LiteralStringDouble" style="#c3e88d"/>
<entry type="LiteralStringBacktick" style="#c3e88d"/>
<entry type="LiteralStringOther" style="#c3e88d"/>
<entry type="LiteralStringSymbol" style="#c3e88d"/>
<entry type="LiteralStringInterpol" style="#c3e88d"/>
<entry type="LiteralStringAffix" style="#c099ff"/>
<entry type="LiteralStringDelimiter" style="#82aaff"/>
<entry type="LiteralStringEscape" style="#82aaff"/>
<entry type="LiteralStringRegex" style="#86e1fc"/>
<entry type="LiteralStringDoc" style="#444a73"/>
<entry type="LiteralStringHeredoc" style="#444a73"/>
<entry type="LiteralNumber" style="#ffc777"/>
<entry type="LiteralNumberBin" style="#ffc777"/>
<entry type="LiteralNumberHex" style="#ffc777"/>
<entry type="LiteralNumberInteger" style="#ffc777"/>
<entry type="LiteralNumberFloat" style="#ffc777"/>
<entry type="LiteralNumberIntegerLong" style="#ffc777"/>
<entry type="LiteralNumberOct" style="#ffc777"/>
<entry type="Operator" style="bold #c3e88d"/>
<entry type="OperatorWord" style="bold #c3e88d"/>
<entry type="Comment" style="italic #444a73"/>
<entry type="CommentSingle" style="italic #444a73"/>
<entry type="CommentMultiline" style="italic #444a73"/>
<entry type="CommentSpecial" style="italic #444a73"/>
<entry type="CommentHashbang" style="italic #444a73"/>
<entry type="CommentPreproc" style="italic #444a73"/>
<entry type="CommentPreprocFile" style="bold #444a73"/>
<entry type="Generic" style="#c8d3f5"/>
<entry type="GenericInserted" style="bg:#1b1d2b #c3e88d"/>
<entry type="GenericDeleted" style="#c53b53 bg:#1b1d2b"/>
<entry type="GenericEmph" style="italic #c8d3f5"/>
<entry type="GenericStrong" style="bold #c8d3f5"/>
<entry type="GenericUnderline" style="underline #c8d3f5"/>
<entry type="GenericHeading" style="bold #ffc777"/>
<entry type="GenericSubheading" style="bold #ffc777"/>
<entry type="GenericOutput" style="#c8d3f5"/>
<entry type="GenericPrompt" style="#c8d3f5"/>
<entry type="GenericError" style="#c53b53"/>
<entry type="GenericTraceback" style="#c53b53"/>
</style>
Some files were not shown because too many files have changed in this diff.