Rename "babylon" to "@babel/parser" (#7937) 🎉
This commit is contained in:
committed by
Henry Zhu
parent
0200a3e510
commit
daf0ca8680
packages/babel-parser/AUTHORS (new file, 41 lines)
List of Acorn contributors. Updated before every release.

Adrian Rakovsky
Alistair Braidwood
Andres Suarez
Aparajita Fishman
Arian Stolwijk
Artem Govorov
Brandon Mills
Charles Hughes
Conrad Irwin
David Bonnet
Forbes Lindesay
Gilad Peleg
impinball
Ingvar Stepanyan
Jesse McCarthy
Jiaxing Wang
Joel Kemp
Johannes Herr
Jürg Lehni
keeyipchan
Kevin Kwok
krator
Marijn Haverbeke
Martin Carlberg
Mathias Bynens
Mathieu 'p01' Henri
Max Schaefer
Max Zerzouri
Mihai Bazon
Mike Rennie
Nick Fitzgerald
Oskar Schöldström
Paul Harper
Peter Rust
PlNG
r-e-d
Rich Harris
Sebastian McKenzie
zsjforcn
packages/babel-parser/CHANGELOG.md (new file, 1073 lines)
File diff suppressed because it is too large.
packages/babel-parser/LICENSE (new file, 19 lines)
Copyright (C) 2012-2014 by various contributors (see AUTHORS)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
packages/babel-parser/README.md (new file, 167 lines)
<p align="center">
  <img alt="@babel/parser" src="https://raw.githubusercontent.com/babel/logo/master/babylon.png" width="700">
</p>

<p align="center">
  The Babel parser (previously Babylon) is a JavaScript parser used in <a href="https://github.com/babel/babel">Babel</a>.
</p>

- The latest ECMAScript version enabled by default (ES2017).
- Comment attachment.
- Support for JSX, Flow, TypeScript.
- Support for experimental language proposals (accepting PRs for anything at least [stage-0](https://github.com/tc39/proposals/blob/master/stage-0-proposals.md)).
## Credits

Heavily based on [acorn](https://github.com/marijnh/acorn) and [acorn-jsx](https://github.com/RReverser/acorn-jsx),
thanks to the awesome work of [@RReverser](https://github.com/RReverser) and [@marijnh](https://github.com/marijnh).

## API

### `babelParser.parse(code, [options])`

### `babelParser.parseExpression(code, [options])`

`parse()` parses the provided `code` as an entire ECMAScript program, while
`parseExpression()` tries to parse a single Expression with performance in
mind. When in doubt, use `.parse()`.

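A minimal sketch of the difference in return values (assuming `@babel/parser` is installed; node type names follow the Babel AST spec referenced below):

```javascript
const babelParser = require("@babel/parser");

// parse() returns a File node wrapping a full Program.
const ast = babelParser.parse("const a = 1;");
console.log(ast.program.body[0].type); // "VariableDeclaration"

// parseExpression() returns the expression node directly.
const expr = babelParser.parseExpression("a + b * 2");
console.log(expr.type); // "BinaryExpression"
```
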
### Options

- **allowImportExportEverywhere**: By default, `import` and `export`
  declarations can only appear at a program's top level. Setting this
  option to `true` allows them anywhere a statement is allowed.

- **allowAwaitOutsideFunction**: By default, `await` use is not allowed
  outside of an async function. Set this to `true` to accept such
  code.

- **allowReturnOutsideFunction**: By default, a return statement at
  the top level raises an error. Set this to `true` to accept such
  code.

- **allowSuperOutsideMethod**: TODO

- **sourceType**: Indicate the mode the code should be parsed in. Can be
  one of `"script"`, `"module"`, or `"unambiguous"`. Defaults to `"script"`. `"unambiguous"` will make @babel/parser attempt to _guess_, based on the presence of ES6 `import` or `export` statements. Files with ES6 `import`s or `export`s are treated as `"module"`; otherwise they are treated as `"script"`.

- **sourceFilename**: Correlate output AST nodes with their source filename. Useful when generating code and source maps from the ASTs of multiple input files.

- **startLine**: By default, the first line of code parsed is treated as line 1. You can provide a line number to alternatively start with. Useful for integration with other source tools.

- **plugins**: Array containing the plugins that you want to enable.

- **strictMode**: TODO

- **ranges**: Adds a `ranges` property to each node: `[node.start, node.end]`

- **tokens**: Adds all parsed tokens to a `tokens` property on the `File` node

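A sketch of passing several of the options above together (the input and option values are only illustrative):

```javascript
const { parse } = require("@babel/parser");

const ast = parse("import x from 'x';\nreturn x;", {
  sourceType: "module",
  allowReturnOutsideFunction: true, // top-level return no longer raises
  sourceFilename: "example.js",
  startLine: 10, // locations are reported starting at line 10
  ranges: true,  // each node also gets a [start, end] range
  tokens: true,  // the File node gets a tokens array
});

console.log(ast.program.sourceType);     // "module"
console.log(ast.program.loc.start.line); // 10
console.log(Array.isArray(ast.tokens));  // true
```
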
### Output

The Babel parser generates AST according to [Babel AST format][].
It is based on [ESTree spec][] with the following deviations:

> There is now an `estree` plugin which reverts these deviations

- [Literal][] token is replaced with [StringLiteral][], [NumericLiteral][], [BooleanLiteral][], [NullLiteral][], [RegExpLiteral][]
- [Property][] token is replaced with [ObjectProperty][] and [ObjectMethod][]
- [MethodDefinition][] is replaced with [ClassMethod][]
- [Program][] and [BlockStatement][] contain an additional `directives` field with [Directive][] and [DirectiveLiteral][]
- The properties of the `value` property ([FunctionExpression][]) of [ClassMethod][], [ObjectProperty][], and [ObjectMethod][] are coerced/brought into the main method node.

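For instance, the first deviation is easy to observe directly (a small sketch; most node properties omitted):

```javascript
const { parseExpression } = require("@babel/parser");

// Babel AST: one node type per literal kind.
console.log(parseExpression("42").type);   // "NumericLiteral"
console.log(parseExpression("'hi'").type); // "StringLiteral"

// With the estree plugin the ESTree shape is restored.
console.log(parseExpression("42", { plugins: ["estree"] }).type); // "Literal"
```
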
AST for JSX code is based on [Facebook JSX AST][].

[Babel AST format]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md
[ESTree spec]: https://github.com/estree/estree

[Literal]: https://github.com/estree/estree/blob/master/es5.md#literal
[Property]: https://github.com/estree/estree/blob/master/es5.md#property
[MethodDefinition]: https://github.com/estree/estree/blob/master/es2015.md#methoddefinition

[StringLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#stringliteral
[NumericLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#numericliteral
[BooleanLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#booleanliteral
[NullLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#nullliteral
[RegExpLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#regexpliteral
[ObjectProperty]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#objectproperty
[ObjectMethod]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#objectmethod
[ClassMethod]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#classmethod
[Program]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#programs
[BlockStatement]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#blockstatement
[Directive]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#directive
[DirectiveLiteral]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#directiveliteral
[FunctionExpression]: https://github.com/babel/babel/tree/master/packages/babel-parser/ast/spec.md#functionexpression

[Facebook JSX AST]: https://github.com/facebook/jsx/blob/master/AST.md

### Semver

The Babel parser follows semver in most situations. The only thing to note is that some spec-compliancy bug fixes may be released under patch versions.

For example: we push a fix to early error on something like [#107](https://github.com/babel/babylon/pull/107) - multiple default exports per file. That would be considered a bug fix even though it would cause a build to fail.

### Example

```javascript
require("@babel/parser").parse("code", {
  // parse in strict mode and allow module declarations
  sourceType: "module",

  plugins: [
    // enable jsx and flow syntax
    "jsx",
    "flow"
  ]
});
```

### Plugins

| Name | Code Example |
|------|--------------|
| `estree` ([repo](https://github.com/estree/estree)) | n/a |
| `jsx` ([repo](https://facebook.github.io/jsx/)) | `<a attr="b">{s}</a>` |
| `flow` ([repo](https://github.com/facebook/flow)) | `var a: string = "";` |
| `flowComments` ([docs](https://flow.org/en/docs/types/comments/)) | `/*:: type Foo = {...}; */` |
| `typescript` ([repo](https://github.com/Microsoft/TypeScript)) | `var a: string = "";` |
| `doExpressions` | `var a = do { if (true) { 'hi'; } };` |
| `objectRestSpread` ([proposal](https://github.com/tc39/proposal-object-rest-spread)) | `var a = { b, ...c };` |
| `decorators` (Stage 1) and `decorators2` (Stage 2 [proposal](https://github.com/tc39/proposal-decorators)) | `@a class A {}` |
| `classProperties` ([proposal](https://github.com/tc39/proposal-class-public-fields)) | `class A { b = 1; }` |
| `classPrivateProperties` ([proposal](https://github.com/tc39/proposal-private-fields)) | `class A { #b = 1; }` |
| `classPrivateMethods` ([proposal](https://github.com/tc39/proposal-private-methods)) | `class A { #c() {} }` |
| `exportDefaultFrom` ([proposal](https://github.com/leebyron/ecmascript-export-default-from)) | `export v from "mod"` |
| `exportNamespaceFrom` ([proposal](https://github.com/leebyron/ecmascript-export-ns-from)) | `export * as ns from "mod"` |
| `asyncGenerators` ([proposal](https://github.com/tc39/proposal-async-iteration)) | `async function*() {}`, `for await (let a of b) {}` |
| `functionBind` ([proposal](https://github.com/zenparsing/es-function-bind)) | `a::b`, `::console.log` |
| `functionSent` | `function.sent` |
| `dynamicImport` ([proposal](https://github.com/tc39/proposal-dynamic-import)) | `import('./guy').then(a)` |
| `numericSeparator` ([proposal](https://github.com/samuelgoto/proposal-numeric-separator)) | `1_000_000` |
| `optionalChaining` ([proposal](https://github.com/tc39/proposal-optional-chaining)) | `a?.b` |
| `importMeta` ([proposal](https://github.com/tc39/proposal-import-meta)) | `import.meta.url` |
| `bigInt` ([proposal](https://github.com/tc39/proposal-bigint)) | `100n` |
| `optionalCatchBinding` ([proposal](https://github.com/babel/proposals/issues/7)) | `try {throw 0;} catch{do();}` |
| `throwExpressions` ([proposal](https://github.com/babel/proposals/issues/23)) | `() => throw new Error("")` |
| `pipelineOperator` ([proposal](https://github.com/babel/proposals/issues/29)) | `a \|> b` |
| `nullishCoalescingOperator` ([proposal](https://github.com/babel/proposals/issues/14)) | `a ?? b` |

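For example, enabling one of the proposal plugins from the table (a minimal sketch; the exact node shape for proposal syntax may vary between versions):

```javascript
const { parseExpression } = require("@babel/parser");

// Without the plugin this throws a SyntaxError pointing at the missing plugin.
const node = parseExpression("a ?? b", {
  plugins: ["nullishCoalescingOperator"],
});
console.log(node.type); // a logical-expression-style node with operator "??"
```
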
### FAQ

#### Will the Babel parser support a plugin system?

Previous issues: [#1351](https://github.com/babel/babel/issues/1351), [#6694](https://github.com/babel/babel/issues/6694).

We currently aren't willing to commit to supporting the API for plugins or the resulting ecosystem (there is already enough work maintaining Babel's own plugin system). It's not clear how to make that API effective, and it would limit our ability to refactor and optimize the codebase.

Our current recommendation for those who want to create their own custom syntax is to fork the Babel parser.

To consume your custom parser, you can add it to your `.babelrc` via its npm package name, or require it if using JavaScript:

```json
{
  "parserOpts": {
    "parser": "custom-fork-of-babel-parser-on-npm-here"
  }
}
```
packages/babel-parser/ast/flow.md (new file, empty)
packages/babel-parser/ast/jsx.md (new file, empty)
packages/babel-parser/ast/spec.md (new file, 1236 lines)
File diff suppressed because it is too large.
packages/babel-parser/bin/babel-parser.js (new executable file, 16 lines)
#!/usr/bin/env node
/* eslint no-var: 0 */

var parser = require("..");
var fs = require("fs");

var filename = process.argv[2];
if (!filename) {
  console.error("no filename specified");
  process.exit(0);
}

var file = fs.readFileSync(filename, "utf8");
var ast = parser.parse(file);

console.log(JSON.stringify(ast, null, " "));
packages/babel-parser/package.json (new file, 36 lines)
{
  "name": "@babel/parser",
  "version": "7.0.0-beta.47",
  "description": "A JavaScript parser",
  "author": "Sebastian McKenzie <sebmck@gmail.com>",
  "homepage": "https://babeljs.io/",
  "license": "MIT",
  "keywords": [
    "babel",
    "javascript",
    "parser",
    "tc39",
    "ecmascript",
    "@babel/parser"
  ],
  "repository": "https://github.com/babel/babel/tree/master/packages/babel-parser",
  "main": "lib/index.js",
  "files": [
    "bin",
    "lib"
  ],
  "engines": {
    "node": ">=6.0.0"
  },
  "devDependencies": {
    "@babel/helper-fixtures": "7.0.0-beta.47",
    "charcodes": "0.1.0",
    "unicode-10.0.0": "^0.7.4"
  },
  "bin": {
    "parser": "./bin/babel-parser.js"
  },
  "publishConfig": {
    "tag": "next"
  }
}
packages/babel-parser/scripts/generate-identifier-regex.js (new file, 70 lines)
"use strict";

// Which Unicode version should be used?
const version = "10.0.0";

const start = require("unicode-" +
  version +
  "/Binary_Property/ID_Start/code-points.js").filter(function(ch) {
  return ch > 0x7f;
});
let last = -1;
const cont = [0x200c, 0x200d].concat(
  require("unicode-" +
    version +
    "/Binary_Property/ID_Continue/code-points.js").filter(function(ch) {
    return ch > 0x7f && search(start, ch, last + 1) == -1;
  })
);

function search(arr, ch, starting) {
  for (let i = starting; arr[i] <= ch && i < arr.length; last = i++) {
    if (arr[i] === ch) return i;
  }
  return -1;
}

function pad(str, width) {
  while (str.length < width) str = "0" + str;
  return str;
}

function esc(code) {
  const hex = code.toString(16);
  if (hex.length <= 2) return "\\x" + pad(hex, 2);
  else return "\\u" + pad(hex, 4);
}

function generate(chars) {
  const astral = [];
  let re = "";
  for (let i = 0, at = 0x10000; i < chars.length; i++) {
    const from = chars[i];
    let to = from;
    while (i < chars.length - 1 && chars[i + 1] == to + 1) {
      i++;
      to++;
    }
    if (to <= 0xffff) {
      if (from == to) re += esc(from);
      else if (from + 1 == to) re += esc(from) + esc(to);
      else re += esc(from) + "-" + esc(to);
    } else {
      astral.push(from - at, to - from);
      at = to;
    }
  }
  return { nonASCII: re, astral: astral };
}

const startData = generate(start);
const contData = generate(cont);

console.log('let nonASCIIidentifierStartChars = "' + startData.nonASCII + '";');
console.log('let nonASCIIidentifierChars = "' + contData.nonASCII + '";');
console.log(
  "const astralIdentifierStartCodes = " + JSON.stringify(startData.astral) + ";"
);
console.log(
  "const astralIdentifierCodes = " + JSON.stringify(contData.astral) + ";"
);
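The printed strings and arrays are presumably pasted into the parser's identifier tables (for example a `util/identifier.js` module; that consumer is an assumption here, not part of this script's output). A sketch of how such a generated string would be used:

```javascript
// Hypothetical consumer of the generated output; the character data below is a
// truncated example of what the script prints, not the full table.
const nonASCIIidentifierStartChars = "\xaa\xb5\xba\xc0-\xd6";
const nonASCIIidentifierStart = new RegExp(
  "[" + nonASCIIidentifierStartChars + "]",
);

console.log(nonASCIIidentifierStart.test("\u00b5")); // true: µ may start an identifier
```
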
packages/babel-parser/src/index.js (new executable file, 124 lines)
// @flow

import type { Options } from "./options";
import Parser, { plugins } from "./parser";

import { types as tokTypes } from "./tokenizer/types";
import "./tokenizer/context";

import type { Expression, File } from "./types";

import estreePlugin from "./plugins/estree";
import flowPlugin from "./plugins/flow";
import jsxPlugin from "./plugins/jsx";
import typescriptPlugin from "./plugins/typescript";
plugins.estree = estreePlugin;
plugins.flow = flowPlugin;
plugins.jsx = jsxPlugin;
plugins.typescript = typescriptPlugin;

export function parse(input: string, options?: Options): File {
  if (options && options.sourceType === "unambiguous") {
    options = {
      ...options,
    };
    try {
      options.sourceType = "module";
      const parser = getParser(options, input);
      const ast = parser.parse();

      // Rather than try to parse as a script first, we opt to parse as a module and convert back
      // to a script where possible to avoid having to do a full re-parse of the input content.
      if (!parser.sawUnambiguousESM) ast.program.sourceType = "script";
      return ast;
    } catch (moduleError) {
      try {
        options.sourceType = "script";
        return getParser(options, input).parse();
      } catch (scriptError) {}

      throw moduleError;
    }
  } else {
    return getParser(options, input).parse();
  }
}

export function parseExpression(input: string, options?: Options): Expression {
  const parser = getParser(options, input);
  if (parser.options.strictMode) {
    parser.state.strict = true;
  }
  return parser.getExpression();
}

export { tokTypes };

function getParser(options: ?Options, input: string): Parser {
  const cls =
    options && options.plugins ? getParserClass(options.plugins) : Parser;
  return new cls(options, input);
}

const parserClassCache: { [key: string]: Class<Parser> } = {};

/** Get a Parser class with plugins applied. */
function getParserClass(
  pluginsFromOptions: $ReadOnlyArray<string>,
): Class<Parser> {
  if (
    hasPlugin(pluginsFromOptions, "decorators") &&
    hasPlugin(pluginsFromOptions, "decorators-legacy")
  ) {
    throw new Error(
      "Cannot use the decorators and decorators-legacy plugin together",
    );
  }

  // Filter out just the plugins that have an actual mixin associated with them.
  let pluginList = pluginsFromOptions.filter(plugin => {
    const p = getPluginName(plugin);
    return p === "estree" || p === "flow" || p === "jsx" || p === "typescript";
  });

  if (hasPlugin(pluginList, "flow")) {
    // ensure flow plugin loads last
    pluginList = pluginList.filter(p => getPluginName(p) !== "flow");
    pluginList.push("flow");
  }

  if (hasPlugin(pluginList, "flow") && hasPlugin(pluginList, "typescript")) {
    throw new Error("Cannot combine flow and typescript plugins.");
  }

  if (hasPlugin(pluginList, "typescript")) {
    // ensure typescript plugin loads last
    pluginList = pluginList.filter(p => getPluginName(p) !== "typescript");
    pluginList.push("typescript");
  }

  if (hasPlugin(pluginList, "estree")) {
    // ensure estree plugin loads first
    pluginList = pluginList.filter(p => getPluginName(p) !== "estree");
    pluginList.unshift("estree");
  }

  const key = pluginList.join("/");
  let cls = parserClassCache[key];
  if (!cls) {
    cls = Parser;
    for (const plugin of pluginList) {
      cls = plugins[plugin](cls);
    }
    parserClassCache[key] = cls;
  }
  return cls;
}

function getPluginName(plugin) {
  return Array.isArray(plugin) ? plugin[0] : plugin;
}

function hasPlugin(pluginsList, name) {
  return pluginsList.some(plugin => getPluginName(plugin) === name);
}
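A minimal sketch of the `"unambiguous"` behaviour implemented above, exercised through the public API:

```javascript
const { parse } = require("@babel/parser");

// Contains an ESM statement, so sawUnambiguousESM is set and "module" sticks.
console.log(
  parse("export default 1;", { sourceType: "unambiguous" }).program.sourceType,
); // "module"

// Nothing module-specific, so the result is converted back to "script".
console.log(
  parse("var a = 1;", { sourceType: "unambiguous" }).program.sourceType,
); // "script"
```
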
packages/babel-parser/src/options.js (new executable file, 64 lines)
// @flow

// A second optional argument can be given to further configure
// the parser process. These options are recognized:

export type Options = {
  sourceType: "script" | "module",
  sourceFilename?: string,
  startLine: number,
  allowAwaitOutsideFunction: boolean,
  allowReturnOutsideFunction: boolean,
  allowImportExportEverywhere: boolean,
  allowSuperOutsideMethod: boolean,
  plugins: $ReadOnlyArray<string>,
  strictMode: ?boolean,
  ranges: boolean,
  tokens: boolean,
};

export const defaultOptions: Options = {
  // Source type ("script" or "module") for different semantics
  sourceType: "script",
  // Source filename.
  sourceFilename: undefined,
  // Line from which to start counting source. Useful for
  // integration with other tools.
  startLine: 1,
  // When enabled, await at the top level is not considered an
  // error.
  allowAwaitOutsideFunction: false,
  // When enabled, a return at the top level is not considered an
  // error.
  allowReturnOutsideFunction: false,
  // When enabled, import/export statements are not constrained to
  // appearing at the top of the program.
  allowImportExportEverywhere: false,
  // TODO
  allowSuperOutsideMethod: false,
  // An array of plugins to enable
  plugins: [],
  // TODO
  strictMode: null,
  // Nodes have their start and end characters offsets recorded in
  // `start` and `end` properties (directly on the node, rather than
  // the `loc` object, which holds line/column data. To also add a
  // [semi-standardized][range] `range` property holding a `[start,
  // end]` array with the same numbers, set the `ranges` option to
  // `true`.
  //
  // [range]: https://bugzilla.mozilla.org/show_bug.cgi?id=745678
  ranges: false,
  // Adds all parsed tokens to a `tokens` property on the `File` node
  tokens: false,
};

// Interpret and default an options object

export function getOptions(opts: ?Options): Options {
  const options: any = {};
  for (const key in defaultOptions) {
    options[key] = opts && opts[key] != null ? opts[key] : defaultOptions[key];
  }
  return options;
}
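A small sketch of how `getOptions` merges user options with `defaultOptions` (the import path applies inside this package; it is only illustrative elsewhere):

```javascript
import { defaultOptions, getOptions } from "./options";

const merged = getOptions({ sourceType: "module", ranges: true });

console.log(merged.sourceType); // "module" (user value wins)
console.log(merged.ranges);     // true
console.log(merged.startLine);  // 1 (falls back to defaultOptions.startLine)
console.log(defaultOptions.startLine); // still 1; the defaults are never mutated
```
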
packages/babel-parser/src/parser/base.js (new file, 35 lines)
// @flow

import type { Options } from "../options";
import { reservedWords } from "../util/identifier";

import type State from "../tokenizer/state";

export default class BaseParser {
  // Properties set by constructor in index.js
  options: Options;
  inModule: boolean;
  plugins: { [key: string]: boolean };
  filename: ?string;
  sawUnambiguousESM: boolean = false;

  // Initialized by Tokenizer
  state: State;
  input: string;

  isReservedWord(word: string): boolean {
    if (word === "await") {
      return this.inModule;
    } else {
      return reservedWords[6](word);
    }
  }

  hasPlugin(name: string): boolean {
    return Object.hasOwnProperty.call(this.plugins, name);
  }

  getPluginOption(plugin: string, name: string) {
    if (this.hasPlugin(plugin)) return this.plugins[plugin][name];
  }
}
packages/babel-parser/src/parser/comments.js (new file, 225 lines)
|
||||
// @flow
|
||||
|
||||
/**
|
||||
* Based on the comment attachment algorithm used in espree and estraverse.
|
||||
*
|
||||
* Redistribution and use in source and binary forms, with or without
|
||||
* modification, are permitted provided that the following conditions are met:
|
||||
*
|
||||
* * Redistributions of source code must retain the above copyright
|
||||
* notice, this list of conditions and the following disclaimer.
|
||||
* * Redistributions in binary form must reproduce the above copyright
|
||||
* notice, this list of conditions and the following disclaimer in the
|
||||
* documentation and/or other materials provided with the distribution.
|
||||
*
|
||||
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
|
||||
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
|
||||
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
|
||||
* ARE DISCLAIMED. IN NO EVENT SHALL <COPYRIGHT HOLDER> BE LIABLE FOR ANY
|
||||
* DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
|
||||
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
|
||||
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
|
||||
* ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
|
||||
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
|
||||
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
|
||||
*/
|
||||
|
||||
import BaseParser from "./base";
|
||||
import type { Comment, Node } from "../types";
|
||||
|
||||
function last<T>(stack: $ReadOnlyArray<T>): T {
|
||||
return stack[stack.length - 1];
|
||||
}
|
||||
|
||||
export default class CommentsParser extends BaseParser {
|
||||
addComment(comment: Comment): void {
|
||||
if (this.filename) comment.loc.filename = this.filename;
|
||||
this.state.trailingComments.push(comment);
|
||||
this.state.leadingComments.push(comment);
|
||||
}
|
||||
|
||||
processComment(node: Node): void {
|
||||
if (node.type === "Program" && node.body.length > 0) return;
|
||||
|
||||
const stack = this.state.commentStack;
|
||||
|
||||
let firstChild, lastChild, trailingComments, i, j;
|
||||
|
||||
if (this.state.trailingComments.length > 0) {
|
||||
// If the first comment in trailingComments comes after the
|
||||
// current node, then we're good - all comments in the array will
|
||||
// come after the node and so it's safe to add them as official
|
||||
// trailingComments.
|
||||
if (this.state.trailingComments[0].start >= node.end) {
|
||||
trailingComments = this.state.trailingComments;
|
||||
this.state.trailingComments = [];
|
||||
} else {
|
||||
// Otherwise, if the first comment doesn't come after the
|
||||
// current node, that means we have a mix of leading and trailing
|
||||
// comments in the array and that leadingComments contains the
|
||||
// same items as trailingComments. Reset trailingComments to
|
||||
// zero items and we'll handle this by evaluating leadingComments
|
||||
// later.
|
||||
this.state.trailingComments.length = 0;
|
||||
}
|
||||
} else if (stack.length > 0) {
|
||||
const lastInStack = last(stack);
|
||||
if (
|
||||
lastInStack.trailingComments &&
|
||||
lastInStack.trailingComments[0].start >= node.end
|
||||
) {
|
||||
trailingComments = lastInStack.trailingComments;
|
||||
delete lastInStack.trailingComments;
|
||||
}
|
||||
}
|
||||
|
||||
// Eating the stack.
|
||||
if (stack.length > 0 && last(stack).start >= node.start) {
|
||||
firstChild = stack.pop();
|
||||
}
|
||||
|
||||
while (stack.length > 0 && last(stack).start >= node.start) {
|
||||
lastChild = stack.pop();
|
||||
}
|
||||
|
||||
if (!lastChild && firstChild) lastChild = firstChild;
|
||||
|
||||
// Attach comments that follow a trailing comma on the last
|
||||
// property in an object literal or a trailing comma in function arguments
|
||||
// as trailing comments
|
||||
if (firstChild && this.state.leadingComments.length > 0) {
|
||||
const lastComment = last(this.state.leadingComments);
|
||||
|
||||
if (firstChild.type === "ObjectProperty") {
|
||||
if (lastComment.start >= node.start) {
|
||||
if (this.state.commentPreviousNode) {
|
||||
for (j = 0; j < this.state.leadingComments.length; j++) {
|
||||
if (
|
||||
this.state.leadingComments[j].end <
|
||||
this.state.commentPreviousNode.end
|
||||
) {
|
||||
this.state.leadingComments.splice(j, 1);
|
||||
j--;
|
||||
}
|
||||
}
|
||||
|
||||
if (this.state.leadingComments.length > 0) {
|
||||
firstChild.trailingComments = this.state.leadingComments;
|
||||
this.state.leadingComments = [];
|
||||
}
|
||||
}
|
||||
}
|
||||
} else if (
|
||||
node.type === "CallExpression" &&
|
||||
node.arguments &&
|
||||
node.arguments.length
|
||||
) {
|
||||
const lastArg = last(node.arguments);
|
||||
|
||||
if (
|
||||
lastArg &&
|
||||
lastComment.start >= lastArg.start &&
|
||||
lastComment.end <= node.end
|
||||
) {
|
||||
if (this.state.commentPreviousNode) {
|
||||
if (this.state.leadingComments.length > 0) {
|
||||
lastArg.trailingComments = this.state.leadingComments;
|
||||
this.state.leadingComments = [];
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (lastChild) {
|
||||
if (lastChild.leadingComments) {
|
||||
if (
|
||||
lastChild !== node &&
|
||||
lastChild.leadingComments.length > 0 &&
|
||||
last(lastChild.leadingComments).end <= node.start
|
||||
) {
|
||||
node.leadingComments = lastChild.leadingComments;
|
||||
delete lastChild.leadingComments;
|
||||
} else {
|
||||
// A leading comment for an anonymous class had been stolen by its first ClassMethod,
|
||||
// so this takes back the leading comment.
|
||||
// See also: https://github.com/eslint/espree/issues/158
|
||||
for (i = lastChild.leadingComments.length - 2; i >= 0; --i) {
|
||||
if (lastChild.leadingComments[i].end <= node.start) {
|
||||
node.leadingComments = lastChild.leadingComments.splice(0, i + 1);
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
} else if (this.state.leadingComments.length > 0) {
|
||||
if (last(this.state.leadingComments).end <= node.start) {
|
||||
if (this.state.commentPreviousNode) {
|
||||
for (j = 0; j < this.state.leadingComments.length; j++) {
|
||||
if (
|
||||
this.state.leadingComments[j].end <
|
||||
this.state.commentPreviousNode.end
|
||||
) {
|
||||
this.state.leadingComments.splice(j, 1);
|
||||
j--;
|
||||
}
|
||||
}
|
||||
}
|
||||
if (this.state.leadingComments.length > 0) {
|
||||
node.leadingComments = this.state.leadingComments;
|
||||
this.state.leadingComments = [];
|
||||
}
|
||||
} else {
|
||||
// https://github.com/eslint/espree/issues/2
|
||||
//
|
||||
// In special cases, such as return (without a value) and
|
||||
// debugger, all comments will end up as leadingComments and
|
||||
// will otherwise be eliminated. This step runs when the
|
||||
// commentStack is empty and there are comments left
|
||||
// in leadingComments.
|
||||
//
|
||||
// This loop figures out the stopping point between the actual
|
||||
// leading and trailing comments by finding the location of the
|
||||
// first comment that comes after the given node.
|
||||
for (i = 0; i < this.state.leadingComments.length; i++) {
|
||||
if (this.state.leadingComments[i].end > node.start) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
// Split the array based on the location of the first comment
|
||||
// that comes after the node. Keep in mind that this could
|
||||
// result in an empty array, and if so, the array must be
|
||||
// deleted.
|
||||
const leadingComments = this.state.leadingComments.slice(0, i);
|
||||
|
||||
if (leadingComments.length) {
|
||||
node.leadingComments = leadingComments;
|
||||
}
|
||||
|
||||
// Similarly, trailing comments are attached later. The variable
|
||||
// must be reset to null if there are no trailing comments.
|
||||
trailingComments = this.state.leadingComments.slice(i);
|
||||
if (trailingComments.length === 0) {
|
||||
trailingComments = null;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
this.state.commentPreviousNode = node;
|
||||
|
||||
if (trailingComments) {
|
||||
if (
|
||||
trailingComments.length &&
|
||||
trailingComments[0].start >= node.start &&
|
||||
last(trailingComments).end <= node.end
|
||||
) {
|
||||
node.innerComments = trailingComments;
|
||||
} else {
|
||||
node.trailingComments = trailingComments;
|
||||
}
|
||||
}
|
||||
|
||||
stack.push(node);
|
||||
}
|
||||
}
|
||||
packages/babel-parser/src/parser/expression.js (new file, 1934 lines)
File diff suppressed because it is too large.
packages/babel-parser/src/parser/index.js (new file, 50 lines)
// @flow

import type { Options } from "../options";
import type { File } from "../types";
import { getOptions } from "../options";
import StatementParser from "./statement";

export const plugins: {
  [name: string]: (superClass: Class<Parser>) => Class<Parser>,
} = {};

export default class Parser extends StatementParser {
  constructor(options: ?Options, input: string) {
    options = getOptions(options);
    super(options, input);

    this.options = options;
    this.inModule = this.options.sourceType === "module";
    this.input = input;
    this.plugins = pluginsMap(this.options.plugins);
    this.filename = options.sourceFilename;

    // If enabled, skip leading hashbang line.
    if (
      this.state.pos === 0 &&
      this.input[0] === "#" &&
      this.input[1] === "!"
    ) {
      this.skipLineComment(2);
    }
  }

  parse(): File {
    const file = this.startNode();
    const program = this.startNode();
    this.nextToken();
    return this.parseTopLevel(file, program);
  }
}

function pluginsMap(
  pluginList: $ReadOnlyArray<string>,
): { [key: string]: boolean } {
  const pluginMap = Object.create(null);
  for (const plugin of pluginList) {
    const [name, options = {}] = Array.isArray(plugin) ? plugin : [plugin];
    pluginMap[name] = options;
  }
  return pluginMap;
}
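`pluginsMap` accepts either a bare plugin name or a `[name, options]` pair; the stored options object is what `getPluginOption` in `base.js` later reads. A hedged usage sketch (the options object below is purely illustrative, not a documented option set):

```javascript
const { parse } = require("@babel/parser");

parse("var a = 1;", {
  plugins: [
    "jsx",                             // bare name
    ["flow", { exampleOption: true }], // [name, options] pair (illustrative options)
  ],
});
```
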
packages/babel-parser/src/parser/location.js (new file, 40 lines)
// @flow

import { getLineInfo, type Position } from "../util/location";
import CommentsParser from "./comments";

// This function is used to raise exceptions on parse errors. It
// takes an offset integer (into the current `input`) to indicate
// the location of the error, attaches the position to the end
// of the error message, and then raises a `SyntaxError` with that
// message.

export default class LocationParser extends CommentsParser {
  raise(
    pos: number,
    message: string,
    {
      missingPluginNames,
      code,
    }: {
      missingPluginNames?: Array<string>,
      code?: string,
    } = {},
  ): empty {
    const loc = getLineInfo(this.input, pos);
    message += ` (${loc.line}:${loc.column})`;
    // $FlowIgnore
    const err: SyntaxError & { pos: number, loc: Position } = new SyntaxError(
      message,
    );
    err.pos = pos;
    err.loc = loc;
    if (missingPluginNames) {
      err.missingPlugin = missingPluginNames;
    }
    if (code !== undefined) {
      err.code = code;
    }
    throw err;
  }
}
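The extra fields attached by `raise` surface on the thrown error. A sketch from the public API, assuming dynamic `import()` still requires the `dynamicImport` plugin in this version:

```javascript
const { parse } = require("@babel/parser");

try {
  parse("import('./mod');"); // dynamicImport plugin not enabled
} catch (err) {
  console.log(err instanceof SyntaxError); // true
  console.log(err.pos, err.loc);           // numeric offset and { line, column }
  console.log(err.missingPlugin);          // e.g. ["dynamicImport"]
}
```
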
packages/babel-parser/src/parser/lval.js (new file, 414 lines)
|
||||
// @flow
|
||||
|
||||
import { types as tt, type TokenType } from "../tokenizer/types";
|
||||
import type {
|
||||
TSParameterProperty,
|
||||
Decorator,
|
||||
Expression,
|
||||
Identifier,
|
||||
Node,
|
||||
ObjectExpression,
|
||||
ObjectPattern,
|
||||
Pattern,
|
||||
RestElement,
|
||||
SpreadElement,
|
||||
} from "../types";
|
||||
import type { Pos, Position } from "../util/location";
|
||||
import { NodeUtils } from "./node";
|
||||
|
||||
export default class LValParser extends NodeUtils {
|
||||
// Forward-declaration: defined in expression.js
|
||||
+checkReservedWord: (
|
||||
word: string,
|
||||
startLoc: number,
|
||||
checkKeywords: boolean,
|
||||
isBinding: boolean,
|
||||
) => void;
|
||||
+parseIdentifier: (liberal?: boolean) => Identifier;
|
||||
+parseMaybeAssign: (
|
||||
noIn?: ?boolean,
|
||||
refShorthandDefaultPos?: ?Pos,
|
||||
afterLeftParse?: Function,
|
||||
refNeedsArrowPos?: ?Pos,
|
||||
) => Expression;
|
||||
+parseObj: <T: ObjectPattern | ObjectExpression>(
|
||||
isPattern: boolean,
|
||||
refShorthandDefaultPos?: ?Pos,
|
||||
) => T;
|
||||
// Forward-declaration: defined in statement.js
|
||||
+parseDecorator: () => Decorator;
|
||||
|
||||
// Convert existing expression atom to assignable pattern
|
||||
// if possible.
|
||||
|
||||
toAssignable(
|
||||
node: Node,
|
||||
isBinding: ?boolean,
|
||||
contextDescription: string,
|
||||
): Node {
|
||||
if (node) {
|
||||
switch (node.type) {
|
||||
case "Identifier":
|
||||
case "ObjectPattern":
|
||||
case "ArrayPattern":
|
||||
case "AssignmentPattern":
|
||||
break;
|
||||
|
||||
case "ObjectExpression":
|
||||
node.type = "ObjectPattern";
|
||||
for (let index = 0; index < node.properties.length; index++) {
|
||||
const prop = node.properties[index];
|
||||
const isLast = index === node.properties.length - 1;
|
||||
this.toAssignableObjectExpressionProp(prop, isBinding, isLast);
|
||||
}
|
||||
break;
|
||||
|
||||
case "ObjectProperty":
|
||||
this.toAssignable(node.value, isBinding, contextDescription);
|
||||
break;
|
||||
|
||||
case "SpreadElement": {
|
||||
this.checkToRestConversion(node);
|
||||
|
||||
node.type = "RestElement";
|
||||
const arg = node.argument;
|
||||
this.toAssignable(arg, isBinding, contextDescription);
|
||||
break;
|
||||
}
|
||||
|
||||
case "ArrayExpression":
|
||||
node.type = "ArrayPattern";
|
||||
this.toAssignableList(node.elements, isBinding, contextDescription);
|
||||
break;
|
||||
|
||||
case "AssignmentExpression":
|
||||
if (node.operator === "=") {
|
||||
node.type = "AssignmentPattern";
|
||||
delete node.operator;
|
||||
} else {
|
||||
this.raise(
|
||||
node.left.end,
|
||||
"Only '=' operator can be used for specifying default value.",
|
||||
);
|
||||
}
|
||||
break;
|
||||
|
||||
case "MemberExpression":
|
||||
if (!isBinding) break;
|
||||
|
||||
default: {
|
||||
const message =
|
||||
"Invalid left-hand side" +
|
||||
(contextDescription
|
||||
? " in " + contextDescription
|
||||
: /* istanbul ignore next */ "expression");
|
||||
this.raise(node.start, message);
|
||||
}
|
||||
}
|
||||
}
|
||||
return node;
|
||||
}
|
||||
|
||||
toAssignableObjectExpressionProp(
|
||||
prop: Node,
|
||||
isBinding: ?boolean,
|
||||
isLast: boolean,
|
||||
) {
|
||||
if (prop.type === "ObjectMethod") {
|
||||
const error =
|
||||
prop.kind === "get" || prop.kind === "set"
|
||||
? "Object pattern can't contain getter or setter"
|
||||
: "Object pattern can't contain methods";
|
||||
|
||||
this.raise(prop.key.start, error);
|
||||
} else if (prop.type === "SpreadElement" && !isLast) {
|
||||
this.raise(
|
||||
prop.start,
|
||||
"The rest element has to be the last element when destructuring",
|
||||
);
|
||||
} else {
|
||||
this.toAssignable(prop, isBinding, "object destructuring pattern");
|
||||
}
|
||||
}
|
||||
|
||||
// Convert list of expression atoms to binding list.
|
||||
|
||||
toAssignableList(
|
||||
exprList: Expression[],
|
||||
isBinding: ?boolean,
|
||||
contextDescription: string,
|
||||
): $ReadOnlyArray<Pattern> {
|
||||
let end = exprList.length;
|
||||
if (end) {
|
||||
const last = exprList[end - 1];
|
||||
if (last && last.type === "RestElement") {
|
||||
--end;
|
||||
} else if (last && last.type === "SpreadElement") {
|
||||
last.type = "RestElement";
|
||||
const arg = last.argument;
|
||||
this.toAssignable(arg, isBinding, contextDescription);
|
||||
if (
|
||||
[
|
||||
"Identifier",
|
||||
"MemberExpression",
|
||||
"ArrayPattern",
|
||||
"ObjectPattern",
|
||||
].indexOf(arg.type) === -1
|
||||
) {
|
||||
this.unexpected(arg.start);
|
||||
}
|
||||
--end;
|
||||
}
|
||||
}
|
||||
for (let i = 0; i < end; i++) {
|
||||
const elt = exprList[i];
|
||||
if (elt && elt.type === "SpreadElement") {
|
||||
this.raise(
|
||||
elt.start,
|
||||
"The rest element has to be the last element when destructuring",
|
||||
);
|
||||
}
|
||||
if (elt) this.toAssignable(elt, isBinding, contextDescription);
|
||||
}
|
||||
return exprList;
|
||||
}
|
||||
|
||||
// Convert list of expression atoms to a list of
|
||||
|
||||
toReferencedList(
|
||||
exprList: $ReadOnlyArray<?Expression>,
|
||||
): $ReadOnlyArray<?Expression> {
|
||||
return exprList;
|
||||
}
|
||||
|
||||
// Parses spread element.
|
||||
|
||||
parseSpread<T: RestElement | SpreadElement>(
|
||||
refShorthandDefaultPos: ?Pos,
|
||||
refNeedsArrowPos?: ?Pos,
|
||||
): T {
|
||||
const node = this.startNode();
|
||||
this.next();
|
||||
node.argument = this.parseMaybeAssign(
|
||||
false,
|
||||
refShorthandDefaultPos,
|
||||
undefined,
|
||||
refNeedsArrowPos,
|
||||
);
|
||||
return this.finishNode(node, "SpreadElement");
|
||||
}
|
||||
|
||||
parseRest(): RestElement {
|
||||
const node = this.startNode();
|
||||
this.next();
|
||||
node.argument = this.parseBindingAtom();
|
||||
return this.finishNode(node, "RestElement");
|
||||
}
|
||||
|
||||
shouldAllowYieldIdentifier(): boolean {
|
||||
return (
|
||||
this.match(tt._yield) && !this.state.strict && !this.state.inGenerator
|
||||
);
|
||||
}
|
||||
|
||||
parseBindingIdentifier(): Identifier {
|
||||
return this.parseIdentifier(this.shouldAllowYieldIdentifier());
|
||||
}
|
||||
|
||||
// Parses lvalue (assignable) atom.
|
||||
parseBindingAtom(): Pattern {
|
||||
switch (this.state.type) {
|
||||
case tt._yield:
|
||||
case tt.name:
|
||||
return this.parseBindingIdentifier();
|
||||
|
||||
case tt.bracketL: {
|
||||
const node = this.startNode();
|
||||
this.next();
|
||||
node.elements = this.parseBindingList(tt.bracketR, true);
|
||||
return this.finishNode(node, "ArrayPattern");
|
||||
}
|
||||
|
||||
case tt.braceL:
|
||||
return this.parseObj(true);
|
||||
|
||||
default:
|
||||
throw this.unexpected();
|
||||
}
|
||||
}
|
||||
|
||||
parseBindingList(
|
||||
close: TokenType,
|
||||
allowEmpty?: boolean,
|
||||
allowModifiers?: boolean,
|
||||
): $ReadOnlyArray<Pattern | TSParameterProperty> {
|
||||
const elts: Array<Pattern | TSParameterProperty> = [];
|
||||
let first = true;
|
||||
while (!this.eat(close)) {
|
||||
if (first) {
|
||||
first = false;
|
||||
} else {
|
||||
this.expect(tt.comma);
|
||||
}
|
||||
if (allowEmpty && this.match(tt.comma)) {
|
||||
// $FlowFixMe This method returns `$ReadOnlyArray<?Pattern>` if `allowEmpty` is set.
|
||||
elts.push(null);
|
||||
} else if (this.eat(close)) {
|
||||
break;
|
||||
} else if (this.match(tt.ellipsis)) {
|
||||
elts.push(this.parseAssignableListItemTypes(this.parseRest()));
|
||||
this.expect(close);
|
||||
break;
|
||||
} else {
|
||||
const decorators = [];
|
||||
if (this.match(tt.at) && this.hasPlugin("decorators")) {
|
||||
this.raise(
|
||||
this.state.start,
|
||||
"Stage 2 decorators cannot be used to decorate parameters",
|
||||
);
|
||||
}
|
||||
while (this.match(tt.at)) {
|
||||
decorators.push(this.parseDecorator());
|
||||
}
|
||||
elts.push(this.parseAssignableListItem(allowModifiers, decorators));
|
||||
}
|
||||
}
|
||||
return elts;
|
||||
}
|
||||
|
||||
parseAssignableListItem(
|
||||
allowModifiers: ?boolean,
|
||||
decorators: Decorator[],
|
||||
): Pattern | TSParameterProperty {
|
||||
const left = this.parseMaybeDefault();
|
||||
this.parseAssignableListItemTypes(left);
|
||||
const elt = this.parseMaybeDefault(left.start, left.loc.start, left);
|
||||
if (decorators.length) {
|
||||
left.decorators = decorators;
|
||||
}
|
||||
return elt;
|
||||
}
|
||||
|
||||
parseAssignableListItemTypes(param: Pattern): Pattern {
|
||||
return param;
|
||||
}
|
||||
|
||||
// Parses assignment pattern around given atom if possible.
|
||||
|
||||
parseMaybeDefault(
|
||||
startPos?: ?number,
|
||||
startLoc?: ?Position,
|
||||
left?: ?Pattern,
|
||||
): Pattern {
|
||||
startLoc = startLoc || this.state.startLoc;
|
||||
startPos = startPos || this.state.start;
|
||||
left = left || this.parseBindingAtom();
|
||||
if (!this.eat(tt.eq)) return left;
|
||||
|
||||
const node = this.startNodeAt(startPos, startLoc);
|
||||
node.left = left;
|
||||
node.right = this.parseMaybeAssign();
|
||||
return this.finishNode(node, "AssignmentPattern");
|
||||
}
|
||||
|
||||
// Verify that a node is an lval — something that can be assigned
|
||||
// to.
|
||||
|
||||
checkLVal(
|
||||
expr: Expression,
|
||||
isBinding: ?boolean,
|
||||
checkClashes: ?{ [key: string]: boolean },
|
||||
contextDescription: string,
|
||||
): void {
|
||||
switch (expr.type) {
|
||||
case "Identifier":
|
||||
this.checkReservedWord(expr.name, expr.start, false, true);
|
||||
|
||||
if (checkClashes) {
|
||||
// we need to prefix this with an underscore for the cases where we have a key of
|
||||
// `__proto__`. there's a bug in old V8 where the following wouldn't work:
|
||||
//
|
||||
// > var obj = Object.create(null);
|
||||
// undefined
|
||||
// > obj.__proto__
|
||||
// null
|
||||
// > obj.__proto__ = true;
|
||||
// true
|
||||
// > obj.__proto__
|
||||
// null
|
||||
const key = `_${expr.name}`;
|
||||
|
||||
if (checkClashes[key]) {
|
||||
this.raise(expr.start, "Argument name clash in strict mode");
|
||||
} else {
|
||||
checkClashes[key] = true;
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case "MemberExpression":
|
||||
if (isBinding) this.raise(expr.start, "Binding member expression");
|
||||
break;
|
||||
|
||||
case "ObjectPattern":
|
||||
for (let prop of expr.properties) {
|
||||
if (prop.type === "ObjectProperty") prop = prop.value;
|
||||
this.checkLVal(
|
||||
prop,
|
||||
isBinding,
|
||||
checkClashes,
|
||||
"object destructuring pattern",
|
||||
);
|
||||
}
|
||||
break;
|
||||
|
||||
case "ArrayPattern":
|
||||
for (const elem of expr.elements) {
|
||||
if (elem) {
|
||||
this.checkLVal(
|
||||
elem,
|
||||
isBinding,
|
||||
checkClashes,
|
||||
"array destructuring pattern",
|
||||
);
|
||||
}
|
||||
}
|
||||
break;
|
||||
|
||||
case "AssignmentPattern":
|
||||
this.checkLVal(
|
||||
expr.left,
|
||||
isBinding,
|
||||
checkClashes,
|
||||
"assignment pattern",
|
||||
);
|
||||
break;
|
||||
|
||||
case "RestElement":
|
||||
this.checkLVal(expr.argument, isBinding, checkClashes, "rest element");
|
||||
break;
|
||||
|
||||
default: {
|
||||
const message =
|
||||
(isBinding
|
||||
? /* istanbul ignore next */ "Binding invalid"
|
||||
: "Invalid") +
|
||||
" left-hand side" +
|
||||
(contextDescription
|
||||
? " in " + contextDescription
|
||||
: /* istanbul ignore next */ "expression");
|
||||
this.raise(expr.start, message);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
checkToRestConversion(node: SpreadElement): void {
|
||||
const validArgumentTypes = ["Identifier", "MemberExpression"];
|
||||
|
||||
if (validArgumentTypes.indexOf(node.argument.type) !== -1) {
|
||||
return;
|
||||
}
|
||||
|
||||
this.raise(node.argument.start, "Invalid rest operator's argument");
|
||||
}
|
||||
}
|
||||
packages/babel-parser/src/parser/node.js (new file, 98 lines)
|
||||
// @flow
|
||||
|
||||
import Parser from "./index";
|
||||
import UtilParser from "./util";
|
||||
import { SourceLocation, type Position } from "../util/location";
|
||||
import type { Comment, Node as NodeType, NodeBase } from "../types";
|
||||
|
||||
// Start an AST node, attaching a start offset.
|
||||
|
||||
const commentKeys = ["leadingComments", "trailingComments", "innerComments"];
|
||||
|
||||
class Node implements NodeBase {
|
||||
constructor(parser: Parser, pos: number, loc: Position) {
|
||||
this.type = "";
|
||||
this.start = pos;
|
||||
this.end = 0;
|
||||
this.loc = new SourceLocation(loc);
|
||||
if (parser && parser.options.ranges) this.range = [pos, 0];
|
||||
if (parser && parser.filename) this.loc.filename = parser.filename;
|
||||
}
|
||||
|
||||
type: string;
|
||||
start: number;
|
||||
end: number;
|
||||
loc: SourceLocation;
|
||||
range: [number, number];
|
||||
leadingComments: Array<Comment>;
|
||||
trailingComments: Array<Comment>;
|
||||
innerComments: Array<Comment>;
|
||||
extra: { [key: string]: any };
|
||||
|
||||
__clone(): this {
|
||||
// $FlowIgnore
|
||||
const node2: any = new Node();
|
||||
Object.keys(this).forEach(key => {
|
||||
// Do not clone comments that are already attached to the node
|
||||
if (commentKeys.indexOf(key) < 0) {
|
||||
// $FlowIgnore
|
||||
node2[key] = this[key];
|
||||
}
|
||||
});
|
||||
|
||||
return node2;
|
||||
}
|
||||
}
|
||||
|
||||
export class NodeUtils extends UtilParser {
|
||||
startNode<T: NodeType>(): T {
|
||||
// $FlowIgnore
|
||||
return new Node(this, this.state.start, this.state.startLoc);
|
||||
}
|
||||
|
||||
startNodeAt<T: NodeType>(pos: number, loc: Position): T {
|
||||
// $FlowIgnore
|
||||
return new Node(this, pos, loc);
|
||||
}
|
||||
|
||||
/** Start a new node with a previous node's location. */
|
||||
startNodeAtNode<T: NodeType>(type: NodeType): T {
|
||||
return this.startNodeAt(type.start, type.loc.start);
|
||||
}
|
||||
|
||||
// Finish an AST node, adding `type` and `end` properties.
|
||||
|
||||
finishNode<T: NodeType>(node: T, type: string): T {
|
||||
return this.finishNodeAt(
|
||||
node,
|
||||
type,
|
||||
this.state.lastTokEnd,
|
||||
this.state.lastTokEndLoc,
|
||||
);
|
||||
}
|
||||
|
||||
// Finish node at given position
|
||||
|
||||
finishNodeAt<T: NodeType>(
|
||||
node: T,
|
||||
type: string,
|
||||
pos: number,
|
||||
loc: Position,
|
||||
): T {
|
||||
node.type = type;
|
||||
node.end = pos;
|
||||
node.loc.end = loc;
|
||||
if (this.options.ranges) node.range[1] = pos;
|
||||
this.processComment(node);
|
||||
return node;
|
||||
}
|
||||
|
||||
/**
|
||||
* Reset the start location of node to the start location of locationNode
|
||||
*/
|
||||
resetStartLocationFromNode(node: NodeBase, locationNode: NodeBase): void {
|
||||
node.start = locationNode.start;
|
||||
node.loc.start = locationNode.loc.start;
|
||||
if (this.options.ranges) node.range[0] = locationNode.range[0];
|
||||
}
|
||||
}
|
||||
packages/babel-parser/src/parser/statement.js (new file, 1785 lines)
File diff suppressed because it is too large.
packages/babel-parser/src/parser/util.js (new file, 150 lines)
|
||||
// @flow
|
||||
|
||||
import { types as tt, type TokenType } from "../tokenizer/types";
|
||||
import Tokenizer from "../tokenizer";
|
||||
import type { Node } from "../types";
|
||||
import { lineBreak } from "../util/whitespace";
|
||||
|
||||
// ## Parser utilities
|
||||
|
||||
export default class UtilParser extends Tokenizer {
|
||||
// TODO
|
||||
|
||||
addExtra(node: Node, key: string, val: any): void {
|
||||
if (!node) return;
|
||||
|
||||
const extra = (node.extra = node.extra || {});
|
||||
extra[key] = val;
|
||||
}
|
||||
|
||||
// TODO
|
||||
|
||||
isRelational(op: "<" | ">"): boolean {
|
||||
return this.match(tt.relational) && this.state.value === op;
|
||||
}
|
||||
|
||||
  isLookaheadRelational(op: "<" | ">"): boolean {
    const l = this.lookahead();
    return l.type == tt.relational && l.value == op;
  }

  // TODO

  expectRelational(op: "<" | ">"): void {
    if (this.isRelational(op)) {
      this.next();
    } else {
      this.unexpected(null, tt.relational);
    }
  }

  // eat() for relational operators.

  eatRelational(op: "<" | ">"): boolean {
    if (this.isRelational(op)) {
      this.next();
      return true;
    }
    return false;
  }

  // Tests whether parsed token is a contextual keyword.

  isContextual(name: string): boolean {
    return (
      this.match(tt.name) &&
      this.state.value === name &&
      !this.state.containsEsc
    );
  }

  isLookaheadContextual(name: string): boolean {
    const l = this.lookahead();
    return l.type === tt.name && l.value === name;
  }

  // Consumes contextual keyword if possible.

  eatContextual(name: string): boolean {
    return this.isContextual(name) && this.eat(tt.name);
  }

  // Asserts that following token is given contextual keyword.

  expectContextual(name: string, message?: string): void {
    if (!this.eatContextual(name)) this.unexpected(null, message);
  }

  // Test whether a semicolon can be inserted at the current position.

  canInsertSemicolon(): boolean {
    return (
      this.match(tt.eof) ||
      this.match(tt.braceR) ||
      this.hasPrecedingLineBreak()
    );
  }

  hasPrecedingLineBreak(): boolean {
    return lineBreak.test(
      this.input.slice(this.state.lastTokEnd, this.state.start),
    );
  }

  // TODO

  isLineTerminator(): boolean {
    return this.eat(tt.semi) || this.canInsertSemicolon();
  }

  // Consume a semicolon, or, failing that, see if we are allowed to
  // pretend that there is a semicolon at this position.

  semicolon(): void {
    if (!this.isLineTerminator()) this.unexpected(null, tt.semi);
  }

  // Expect a token of a given type. If found, consume it, otherwise,
  // raise an unexpected token error at given pos.

  expect(type: TokenType, pos?: ?number): void {
    this.eat(type) || this.unexpected(pos, type);
  }

  // Raise an unexpected token error. Can take the expected token type
  // instead of a message string.

  unexpected(
    pos: ?number,
    messageOrType: string | TokenType = "Unexpected token",
  ): empty {
    if (typeof messageOrType !== "string") {
      messageOrType = `Unexpected token, expected "${messageOrType.label}"`;
    }
    throw this.raise(pos != null ? pos : this.state.start, messageOrType);
  }

  expectPlugin(name: string, pos?: ?number): true {
    if (!this.hasPlugin(name)) {
      throw this.raise(
        pos != null ? pos : this.state.start,
        `This experimental syntax requires enabling the parser plugin: '${name}'`,
        { missingPluginNames: [name] },
      );
    }

    return true;
  }

  expectOnePlugin(names: Array<string>, pos?: ?number): void {
    if (!names.some(n => this.hasPlugin(n))) {
      throw this.raise(
        pos != null ? pos : this.state.start,
        `This experimental syntax requires enabling one of the following parser plugin(s): '${names.join(
          ", ",
        )}'`,
        { missingPluginNames: names },
      );
    }
  }
}
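The helpers above are where most parse errors originate: expect()/unexpected() produce the "Unexpected token" messages and expectPlugin()/expectOnePlugin() produce the missing-plugin hints. A minimal, illustrative sketch of how that surfaces through the public parse() API (the input and the logged text are examples, not output captured from this commit):

const { parse } = require("@babel/parser");

try {
  // Without the "jsx" plugin the leading "<" is rejected via unexpected().
  parse("<div />;");
} catch (e) {
  console.log(e.message); // mentions an unexpected token and its position
}

// With the plugin enabled the same input parses.
const ast = parse("<div />;", { plugins: ["jsx"] });
console.log(ast.program.body[0].expression.type); // "JSXElement"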
363
packages/babel-parser/src/plugins/estree.js
Normal file
@@ -0,0 +1,363 @@
|
||||
// @flow
|
||||
|
||||
import { types as tt, TokenType } from "../tokenizer/types";
|
||||
import type Parser from "../parser";
|
||||
import * as N from "../types";
|
||||
import type { Pos, Position } from "../util/location";
|
||||
|
||||
function isSimpleProperty(node: N.Node): boolean {
|
||||
return (
|
||||
node != null &&
|
||||
node.type === "Property" &&
|
||||
node.kind === "init" &&
|
||||
node.method === false
|
||||
);
|
||||
}
|
||||
|
||||
export default (superClass: Class<Parser>): Class<Parser> =>
|
||||
class extends superClass {
|
||||
estreeParseRegExpLiteral({ pattern, flags }: N.RegExpLiteral): N.Node {
|
||||
let regex = null;
|
||||
try {
|
||||
regex = new RegExp(pattern, flags);
|
||||
} catch (e) {
|
||||
// In environments that don't support these flags value will
|
||||
// be null as the regex can't be represented natively.
|
||||
}
|
||||
const node = this.estreeParseLiteral(regex);
|
||||
node.regex = { pattern, flags };
|
||||
|
||||
return node;
|
||||
}
|
||||
|
||||
estreeParseLiteral(value: any): N.Node {
|
||||
return this.parseLiteral(value, "Literal");
|
||||
}
|
||||
|
||||
directiveToStmt(directive: N.Directive): N.ExpressionStatement {
|
||||
const directiveLiteral = directive.value;
|
||||
|
||||
const stmt = this.startNodeAt(directive.start, directive.loc.start);
|
||||
const expression = this.startNodeAt(
|
||||
directiveLiteral.start,
|
||||
directiveLiteral.loc.start,
|
||||
);
|
||||
|
||||
expression.value = directiveLiteral.value;
|
||||
expression.raw = directiveLiteral.extra.raw;
|
||||
|
||||
stmt.expression = this.finishNodeAt(
|
||||
expression,
|
||||
"Literal",
|
||||
directiveLiteral.end,
|
||||
directiveLiteral.loc.end,
|
||||
);
|
||||
stmt.directive = directiveLiteral.extra.raw.slice(1, -1);
|
||||
|
||||
return this.finishNodeAt(
|
||||
stmt,
|
||||
"ExpressionStatement",
|
||||
directive.end,
|
||||
directive.loc.end,
|
||||
);
|
||||
}
|
||||
|
||||
// ==================================
|
||||
// Overrides
|
||||
// ==================================
|
||||
|
||||
initFunction(
|
||||
node: N.BodilessFunctionOrMethodBase,
|
||||
isAsync: ?boolean,
|
||||
): void {
|
||||
super.initFunction(node, isAsync);
|
||||
node.expression = false;
|
||||
}
|
||||
|
||||
checkDeclaration(node: N.Pattern | N.ObjectProperty): void {
|
||||
if (isSimpleProperty(node)) {
|
||||
this.checkDeclaration(((node: any): N.EstreeProperty).value);
|
||||
} else {
|
||||
super.checkDeclaration(node);
|
||||
}
|
||||
}
|
||||
|
||||
checkGetterSetterParams(method: N.ObjectMethod | N.ClassMethod): void {
|
||||
const prop = ((method: any): N.EstreeProperty | N.EstreeMethodDefinition);
|
||||
const paramCount = prop.kind === "get" ? 0 : 1;
|
||||
const start = prop.start;
|
||||
if (prop.value.params.length !== paramCount) {
|
||||
if (prop.kind === "get") {
|
||||
this.raise(start, "getter must not have any formal parameters");
|
||||
} else {
|
||||
this.raise(start, "setter must have exactly one formal parameter");
|
||||
}
|
||||
}
|
||||
|
||||
if (prop.kind === "set" && prop.value.params[0].type === "RestElement") {
|
||||
this.raise(
|
||||
start,
|
||||
"setter function argument must not be a rest parameter",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
checkLVal(
|
||||
expr: N.Expression,
|
||||
isBinding: ?boolean,
|
||||
checkClashes: ?{ [key: string]: boolean },
|
||||
contextDescription: string,
|
||||
): void {
|
||||
switch (expr.type) {
|
||||
case "ObjectPattern":
|
||||
expr.properties.forEach(prop => {
|
||||
this.checkLVal(
|
||||
prop.type === "Property" ? prop.value : prop,
|
||||
isBinding,
|
||||
checkClashes,
|
||||
"object destructuring pattern",
|
||||
);
|
||||
});
|
||||
break;
|
||||
default:
|
||||
super.checkLVal(expr, isBinding, checkClashes, contextDescription);
|
||||
}
|
||||
}
|
||||
|
||||
checkPropClash(
|
||||
prop: N.ObjectMember,
|
||||
propHash: { [key: string]: boolean },
|
||||
): void {
|
||||
if (prop.computed || !isSimpleProperty(prop)) return;
|
||||
|
||||
const key = prop.key;
|
||||
// It is either an Identifier or a String/NumericLiteral
|
||||
const name = key.type === "Identifier" ? key.name : String(key.value);
|
||||
|
||||
if (name === "__proto__") {
|
||||
if (propHash.proto) {
|
||||
this.raise(key.start, "Redefinition of __proto__ property");
|
||||
}
|
||||
propHash.proto = true;
|
||||
}
|
||||
}
|
||||
|
||||
isStrictBody(node: { body: N.BlockStatement }): boolean {
|
||||
const isBlockStatement = node.body.type === "BlockStatement";
|
||||
|
||||
if (isBlockStatement && node.body.body.length > 0) {
|
||||
for (const directive of node.body.body) {
|
||||
if (
|
||||
directive.type === "ExpressionStatement" &&
|
||||
directive.expression.type === "Literal"
|
||||
) {
|
||||
if (directive.expression.value === "use strict") return true;
|
||||
} else {
|
||||
// Break for the first non-literal expression
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return false;
|
||||
}
|
||||
|
||||
isValidDirective(stmt: N.Statement): boolean {
|
||||
return (
|
||||
stmt.type === "ExpressionStatement" &&
|
||||
stmt.expression.type === "Literal" &&
|
||||
typeof stmt.expression.value === "string" &&
|
||||
(!stmt.expression.extra || !stmt.expression.extra.parenthesized)
|
||||
);
|
||||
}
|
||||
|
||||
stmtToDirective(stmt: N.Statement): N.Directive {
|
||||
const directive = super.stmtToDirective(stmt);
|
||||
const value = stmt.expression.value;
|
||||
|
||||
// Reset value to the actual value as in estree mode we want
|
||||
// the stmt to have the real value and not the raw value
|
||||
directive.value.value = value;
|
||||
|
||||
return directive;
|
||||
}
|
||||
|
||||
parseBlockBody(
|
||||
node: N.BlockStatementLike,
|
||||
allowDirectives: ?boolean,
|
||||
topLevel: boolean,
|
||||
end: TokenType,
|
||||
): void {
|
||||
super.parseBlockBody(node, allowDirectives, topLevel, end);
|
||||
|
||||
const directiveStatements = node.directives.map(d =>
|
||||
this.directiveToStmt(d),
|
||||
);
|
||||
node.body = directiveStatements.concat(node.body);
|
||||
delete node.directives;
|
||||
}
|
||||
|
||||
pushClassMethod(
|
||||
classBody: N.ClassBody,
|
||||
method: N.ClassMethod,
|
||||
isGenerator: boolean,
|
||||
isAsync: boolean,
|
||||
isConstructor: boolean,
|
||||
): void {
|
||||
this.parseMethod(
|
||||
method,
|
||||
isGenerator,
|
||||
isAsync,
|
||||
isConstructor,
|
||||
"MethodDefinition",
|
||||
);
|
||||
if (method.typeParameters) {
|
||||
// $FlowIgnore
|
||||
method.value.typeParameters = method.typeParameters;
|
||||
delete method.typeParameters;
|
||||
}
|
||||
classBody.body.push(method);
|
||||
}
|
||||
|
||||
parseExprAtom(refShorthandDefaultPos?: ?Pos): N.Expression {
|
||||
switch (this.state.type) {
|
||||
case tt.regexp:
|
||||
return this.estreeParseRegExpLiteral(this.state.value);
|
||||
|
||||
case tt.num:
|
||||
case tt.string:
|
||||
return this.estreeParseLiteral(this.state.value);
|
||||
|
||||
case tt._null:
|
||||
return this.estreeParseLiteral(null);
|
||||
|
||||
case tt._true:
|
||||
return this.estreeParseLiteral(true);
|
||||
|
||||
case tt._false:
|
||||
return this.estreeParseLiteral(false);
|
||||
|
||||
default:
|
||||
return super.parseExprAtom(refShorthandDefaultPos);
|
||||
}
|
||||
}
|
||||
|
||||
parseLiteral<T: N.Literal>(
|
||||
value: any,
|
||||
type: /*T["kind"]*/ string,
|
||||
startPos?: number,
|
||||
startLoc?: Position,
|
||||
): T {
|
||||
const node = super.parseLiteral(value, type, startPos, startLoc);
|
||||
node.raw = node.extra.raw;
|
||||
delete node.extra;
|
||||
|
||||
return node;
|
||||
}
|
||||
|
||||
parseFunctionBody(node: N.Function, allowExpression: ?boolean): void {
|
||||
super.parseFunctionBody(node, allowExpression);
|
||||
node.expression = node.body.type !== "BlockStatement";
|
||||
}
|
||||
|
||||
parseMethod<T: N.MethodLike>(
|
||||
node: T,
|
||||
isGenerator: boolean,
|
||||
isAsync: boolean,
|
||||
isConstructor: boolean,
|
||||
type: string,
|
||||
): T {
|
||||
let funcNode = this.startNode();
|
||||
funcNode.kind = node.kind; // provide kind, so super method correctly sets state
|
||||
funcNode = super.parseMethod(
|
||||
funcNode,
|
||||
isGenerator,
|
||||
isAsync,
|
||||
isConstructor,
|
||||
"FunctionExpression",
|
||||
);
|
||||
delete funcNode.kind;
|
||||
// $FlowIgnore
|
||||
node.value = funcNode;
|
||||
|
||||
return this.finishNode(node, type);
|
||||
}
|
||||
|
||||
parseObjectMethod(
|
||||
prop: N.ObjectMethod,
|
||||
isGenerator: boolean,
|
||||
isAsync: boolean,
|
||||
isPattern: boolean,
|
||||
containsEsc: boolean,
|
||||
): ?N.ObjectMethod {
|
||||
const node: N.EstreeProperty = (super.parseObjectMethod(
|
||||
prop,
|
||||
isGenerator,
|
||||
isAsync,
|
||||
isPattern,
|
||||
containsEsc,
|
||||
): any);
|
||||
|
||||
if (node) {
|
||||
node.type = "Property";
|
||||
if (node.kind === "method") node.kind = "init";
|
||||
node.shorthand = false;
|
||||
}
|
||||
|
||||
return (node: any);
|
||||
}
|
||||
|
||||
parseObjectProperty(
|
||||
prop: N.ObjectProperty,
|
||||
startPos: ?number,
|
||||
startLoc: ?Position,
|
||||
isPattern: boolean,
|
||||
refShorthandDefaultPos: ?Pos,
|
||||
): ?N.ObjectProperty {
|
||||
const node: N.EstreeProperty = (super.parseObjectProperty(
|
||||
prop,
|
||||
startPos,
|
||||
startLoc,
|
||||
isPattern,
|
||||
refShorthandDefaultPos,
|
||||
): any);
|
||||
|
||||
if (node) {
|
||||
node.kind = "init";
|
||||
node.type = "Property";
|
||||
}
|
||||
|
||||
return (node: any);
|
||||
}
|
||||
|
||||
toAssignable(
|
||||
node: N.Node,
|
||||
isBinding: ?boolean,
|
||||
contextDescription: string,
|
||||
): N.Node {
|
||||
if (isSimpleProperty(node)) {
|
||||
this.toAssignable(node.value, isBinding, contextDescription);
|
||||
|
||||
return node;
|
||||
}
|
||||
|
||||
return super.toAssignable(node, isBinding, contextDescription);
|
||||
}
|
||||
|
||||
toAssignableObjectExpressionProp(
|
||||
prop: N.Node,
|
||||
isBinding: ?boolean,
|
||||
isLast: boolean,
|
||||
) {
|
||||
if (prop.kind === "get" || prop.kind === "set") {
|
||||
this.raise(
|
||||
prop.key.start,
|
||||
"Object pattern can't contain getter or setter",
|
||||
);
|
||||
} else if (prop.method) {
|
||||
this.raise(prop.key.start, "Object pattern can't contain methods");
|
||||
} else {
|
||||
super.toAssignableObjectExpressionProp(prop, isBinding, isLast);
|
||||
}
|
||||
}
|
||||
};
|
||||
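The overrides above reshape Babel's default AST (ObjectProperty, NumericLiteral, ...) into the ESTree shape (Property, Literal, MethodDefinition). A small comparison through the public API; a sketch assuming the published @babel/parser entry point:

const { parse } = require("@babel/parser");

const estree = parse("({ answer: 42 });", { plugins: ["estree"] });
const prop = estree.program.body[0].expression.properties[0];
console.log(prop.type, prop.kind); // "Property" "init"
console.log(prop.value.type);      // "Literal" (raw text kept on prop.value.raw)

const babel = parse("({ answer: 42 });");
console.log(babel.program.body[0].expression.properties[0].type); // "ObjectProperty"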
2590
packages/babel-parser/src/plugins/flow.js
Normal file
File diff suppressed because it is too large
561
packages/babel-parser/src/plugins/jsx/index.js
Normal file
@@ -0,0 +1,561 @@
|
||||
// @flow
|
||||
|
||||
import * as charCodes from "charcodes";
|
||||
|
||||
import XHTMLEntities from "./xhtml";
|
||||
import type Parser from "../../parser";
|
||||
import { TokenType, types as tt } from "../../tokenizer/types";
|
||||
import { TokContext, types as tc } from "../../tokenizer/context";
|
||||
import * as N from "../../types";
|
||||
import { isIdentifierChar, isIdentifierStart } from "../../util/identifier";
|
||||
import type { Pos, Position } from "../../util/location";
|
||||
import { isNewLine } from "../../util/whitespace";
|
||||
|
||||
const HEX_NUMBER = /^[\da-fA-F]+$/;
|
||||
const DECIMAL_NUMBER = /^\d+$/;
|
||||
|
||||
tc.j_oTag = new TokContext("<tag", false);
|
||||
tc.j_cTag = new TokContext("</tag", false);
|
||||
tc.j_expr = new TokContext("<tag>...</tag>", true, true);
|
||||
|
||||
tt.jsxName = new TokenType("jsxName");
|
||||
tt.jsxText = new TokenType("jsxText", { beforeExpr: true });
|
||||
tt.jsxTagStart = new TokenType("jsxTagStart", { startsExpr: true });
|
||||
tt.jsxTagEnd = new TokenType("jsxTagEnd");
|
||||
|
||||
tt.jsxTagStart.updateContext = function() {
|
||||
this.state.context.push(tc.j_expr); // treat as beginning of JSX expression
|
||||
this.state.context.push(tc.j_oTag); // start opening tag context
|
||||
this.state.exprAllowed = false;
|
||||
};
|
||||
|
||||
tt.jsxTagEnd.updateContext = function(prevType) {
|
||||
const out = this.state.context.pop();
|
||||
if ((out === tc.j_oTag && prevType === tt.slash) || out === tc.j_cTag) {
|
||||
this.state.context.pop();
|
||||
this.state.exprAllowed = this.curContext() === tc.j_expr;
|
||||
} else {
|
||||
this.state.exprAllowed = true;
|
||||
}
|
||||
};
|
||||
|
||||
function isFragment(object: ?N.JSXElement): boolean {
|
||||
return object
|
||||
? object.type === "JSXOpeningFragment" ||
|
||||
object.type === "JSXClosingFragment"
|
||||
: false;
|
||||
}
|
||||
|
||||
// Transforms JSX element name to string.
|
||||
|
||||
function getQualifiedJSXName(
|
||||
object: N.JSXIdentifier | N.JSXNamespacedName | N.JSXMemberExpression,
|
||||
): string {
|
||||
if (object.type === "JSXIdentifier") {
|
||||
return object.name;
|
||||
}
|
||||
|
||||
if (object.type === "JSXNamespacedName") {
|
||||
return object.namespace.name + ":" + object.name.name;
|
||||
}
|
||||
|
||||
if (object.type === "JSXMemberExpression") {
|
||||
return (
|
||||
getQualifiedJSXName(object.object) +
|
||||
"." +
|
||||
getQualifiedJSXName(object.property)
|
||||
);
|
||||
}
|
||||
|
||||
// istanbul ignore next
|
||||
throw new Error("Node had unexpected type: " + object.type);
|
||||
}
|
||||
|
||||
export default (superClass: Class<Parser>): Class<Parser> =>
|
||||
class extends superClass {
|
||||
// Reads inline JSX contents token.
|
||||
|
||||
jsxReadToken(): void {
|
||||
let out = "";
|
||||
let chunkStart = this.state.pos;
|
||||
for (;;) {
|
||||
if (this.state.pos >= this.input.length) {
|
||||
this.raise(this.state.start, "Unterminated JSX contents");
|
||||
}
|
||||
|
||||
const ch = this.input.charCodeAt(this.state.pos);
|
||||
|
||||
switch (ch) {
|
||||
case charCodes.lessThan:
|
||||
case charCodes.leftCurlyBrace:
|
||||
if (this.state.pos === this.state.start) {
|
||||
if (ch === charCodes.lessThan && this.state.exprAllowed) {
|
||||
++this.state.pos;
|
||||
return this.finishToken(tt.jsxTagStart);
|
||||
}
|
||||
return this.getTokenFromCode(ch);
|
||||
}
|
||||
out += this.input.slice(chunkStart, this.state.pos);
|
||||
return this.finishToken(tt.jsxText, out);
|
||||
|
||||
case charCodes.ampersand:
|
||||
out += this.input.slice(chunkStart, this.state.pos);
|
||||
out += this.jsxReadEntity();
|
||||
chunkStart = this.state.pos;
|
||||
break;
|
||||
|
||||
default:
|
||||
if (isNewLine(ch)) {
|
||||
out += this.input.slice(chunkStart, this.state.pos);
|
||||
out += this.jsxReadNewLine(true);
|
||||
chunkStart = this.state.pos;
|
||||
} else {
|
||||
++this.state.pos;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
jsxReadNewLine(normalizeCRLF: boolean): string {
|
||||
const ch = this.input.charCodeAt(this.state.pos);
|
||||
let out;
|
||||
++this.state.pos;
|
||||
if (
|
||||
ch === charCodes.carriageReturn &&
|
||||
this.input.charCodeAt(this.state.pos) === charCodes.lineFeed
|
||||
) {
|
||||
++this.state.pos;
|
||||
out = normalizeCRLF ? "\n" : "\r\n";
|
||||
} else {
|
||||
out = String.fromCharCode(ch);
|
||||
}
|
||||
++this.state.curLine;
|
||||
this.state.lineStart = this.state.pos;
|
||||
|
||||
return out;
|
||||
}
|
||||
|
||||
jsxReadString(quote: number): void {
|
||||
let out = "";
|
||||
let chunkStart = ++this.state.pos;
|
||||
for (;;) {
|
||||
if (this.state.pos >= this.input.length) {
|
||||
this.raise(this.state.start, "Unterminated string constant");
|
||||
}
|
||||
|
||||
const ch = this.input.charCodeAt(this.state.pos);
|
||||
if (ch === quote) break;
|
||||
if (ch === charCodes.ampersand) {
|
||||
out += this.input.slice(chunkStart, this.state.pos);
|
||||
out += this.jsxReadEntity();
|
||||
chunkStart = this.state.pos;
|
||||
} else if (isNewLine(ch)) {
|
||||
out += this.input.slice(chunkStart, this.state.pos);
|
||||
out += this.jsxReadNewLine(false);
|
||||
chunkStart = this.state.pos;
|
||||
} else {
|
||||
++this.state.pos;
|
||||
}
|
||||
}
|
||||
out += this.input.slice(chunkStart, this.state.pos++);
|
||||
return this.finishToken(tt.string, out);
|
||||
}
|
||||
|
||||
jsxReadEntity(): string {
|
||||
let str = "";
|
||||
let count = 0;
|
||||
let entity;
|
||||
let ch = this.input[this.state.pos];
|
||||
|
||||
const startPos = ++this.state.pos;
|
||||
while (this.state.pos < this.input.length && count++ < 10) {
|
||||
ch = this.input[this.state.pos++];
|
||||
if (ch === ";") {
|
||||
if (str[0] === "#") {
|
||||
if (str[1] === "x") {
|
||||
str = str.substr(2);
|
||||
if (HEX_NUMBER.test(str)) {
|
||||
entity = String.fromCodePoint(parseInt(str, 16));
|
||||
}
|
||||
} else {
|
||||
str = str.substr(1);
|
||||
if (DECIMAL_NUMBER.test(str)) {
|
||||
entity = String.fromCodePoint(parseInt(str, 10));
|
||||
}
|
||||
}
|
||||
} else {
|
||||
entity = XHTMLEntities[str];
|
||||
}
|
||||
break;
|
||||
}
|
||||
str += ch;
|
||||
}
|
||||
if (!entity) {
|
||||
this.state.pos = startPos;
|
||||
return "&";
|
||||
}
|
||||
return entity;
|
||||
}
|
||||
|
||||
// Read a JSX identifier (valid tag or attribute name).
|
||||
//
|
||||
// Optimized version since JSX identifiers can"t contain
|
||||
// escape characters and so can be read as single slice.
|
||||
// Also assumes that first character was already checked
|
||||
// by isIdentifierStart in readToken.
|
||||
|
||||
jsxReadWord(): void {
|
||||
let ch;
|
||||
const start = this.state.pos;
|
||||
do {
|
||||
ch = this.input.charCodeAt(++this.state.pos);
|
||||
} while (isIdentifierChar(ch) || ch === charCodes.dash);
|
||||
return this.finishToken(
|
||||
tt.jsxName,
|
||||
this.input.slice(start, this.state.pos),
|
||||
);
|
||||
}
|
||||
|
||||
// Parse next token as JSX identifier
|
||||
|
||||
jsxParseIdentifier(): N.JSXIdentifier {
|
||||
const node = this.startNode();
|
||||
if (this.match(tt.jsxName)) {
|
||||
node.name = this.state.value;
|
||||
} else if (this.state.type.keyword) {
|
||||
node.name = this.state.type.keyword;
|
||||
} else {
|
||||
this.unexpected();
|
||||
}
|
||||
this.next();
|
||||
return this.finishNode(node, "JSXIdentifier");
|
||||
}
|
||||
|
||||
// Parse namespaced identifier.
|
||||
|
||||
jsxParseNamespacedName(): N.JSXNamespacedName {
|
||||
const startPos = this.state.start;
|
||||
const startLoc = this.state.startLoc;
|
||||
const name = this.jsxParseIdentifier();
|
||||
if (!this.eat(tt.colon)) return name;
|
||||
|
||||
const node = this.startNodeAt(startPos, startLoc);
|
||||
node.namespace = name;
|
||||
node.name = this.jsxParseIdentifier();
|
||||
return this.finishNode(node, "JSXNamespacedName");
|
||||
}
|
||||
|
||||
// Parses element name in any form - namespaced, member
|
||||
// or single identifier.
|
||||
|
||||
jsxParseElementName(): N.JSXNamespacedName | N.JSXMemberExpression {
|
||||
const startPos = this.state.start;
|
||||
const startLoc = this.state.startLoc;
|
||||
let node = this.jsxParseNamespacedName();
|
||||
while (this.eat(tt.dot)) {
|
||||
const newNode = this.startNodeAt(startPos, startLoc);
|
||||
newNode.object = node;
|
||||
newNode.property = this.jsxParseIdentifier();
|
||||
node = this.finishNode(newNode, "JSXMemberExpression");
|
||||
}
|
||||
return node;
|
||||
}
|
||||
|
||||
// Parses any type of JSX attribute value.
|
||||
|
||||
jsxParseAttributeValue(): N.Expression {
|
||||
let node;
|
||||
switch (this.state.type) {
|
||||
case tt.braceL:
|
||||
node = this.jsxParseExpressionContainer();
|
||||
if (node.expression.type === "JSXEmptyExpression") {
|
||||
throw this.raise(
|
||||
node.start,
|
||||
"JSX attributes must only be assigned a non-empty expression",
|
||||
);
|
||||
} else {
|
||||
return node;
|
||||
}
|
||||
|
||||
case tt.jsxTagStart:
|
||||
case tt.string:
|
||||
return this.parseExprAtom();
|
||||
|
||||
default:
|
||||
throw this.raise(
|
||||
this.state.start,
|
||||
"JSX value should be either an expression or a quoted JSX text",
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// JSXEmptyExpression is a unique type since it doesn't actually parse anything,
|
||||
// and so it should start at the end of last read token (left brace) and finish
|
||||
// at the beginning of the next one (right brace).
|
||||
|
||||
jsxParseEmptyExpression(): N.JSXEmptyExpression {
|
||||
const node = this.startNodeAt(
|
||||
this.state.lastTokEnd,
|
||||
this.state.lastTokEndLoc,
|
||||
);
|
||||
return this.finishNodeAt(
|
||||
node,
|
||||
"JSXEmptyExpression",
|
||||
this.state.start,
|
||||
this.state.startLoc,
|
||||
);
|
||||
}
|
||||
|
||||
// Parse JSX spread child
|
||||
|
||||
jsxParseSpreadChild(): N.JSXSpreadChild {
|
||||
const node = this.startNode();
|
||||
this.expect(tt.braceL);
|
||||
this.expect(tt.ellipsis);
|
||||
node.expression = this.parseExpression();
|
||||
this.expect(tt.braceR);
|
||||
|
||||
return this.finishNode(node, "JSXSpreadChild");
|
||||
}
|
||||
|
||||
// Parses JSX expression enclosed into curly brackets.
|
||||
|
||||
jsxParseExpressionContainer(): N.JSXExpressionContainer {
|
||||
const node = this.startNode();
|
||||
this.next();
|
||||
if (this.match(tt.braceR)) {
|
||||
node.expression = this.jsxParseEmptyExpression();
|
||||
} else {
|
||||
node.expression = this.parseExpression();
|
||||
}
|
||||
this.expect(tt.braceR);
|
||||
return this.finishNode(node, "JSXExpressionContainer");
|
||||
}
|
||||
|
||||
// Parses following JSX attribute name-value pair.
|
||||
|
||||
jsxParseAttribute(): N.JSXAttribute {
|
||||
const node = this.startNode();
|
||||
if (this.eat(tt.braceL)) {
|
||||
this.expect(tt.ellipsis);
|
||||
node.argument = this.parseMaybeAssign();
|
||||
this.expect(tt.braceR);
|
||||
return this.finishNode(node, "JSXSpreadAttribute");
|
||||
}
|
||||
node.name = this.jsxParseNamespacedName();
|
||||
node.value = this.eat(tt.eq) ? this.jsxParseAttributeValue() : null;
|
||||
return this.finishNode(node, "JSXAttribute");
|
||||
}
|
||||
|
||||
// Parses JSX opening tag starting after "<".
|
||||
|
||||
jsxParseOpeningElementAt(
|
||||
startPos: number,
|
||||
startLoc: Position,
|
||||
): N.JSXOpeningElement {
|
||||
const node = this.startNodeAt(startPos, startLoc);
|
||||
if (this.match(tt.jsxTagEnd)) {
|
||||
this.expect(tt.jsxTagEnd);
|
||||
return this.finishNode(node, "JSXOpeningFragment");
|
||||
}
|
||||
node.attributes = [];
|
||||
node.name = this.jsxParseElementName();
|
||||
while (!this.match(tt.slash) && !this.match(tt.jsxTagEnd)) {
|
||||
node.attributes.push(this.jsxParseAttribute());
|
||||
}
|
||||
node.selfClosing = this.eat(tt.slash);
|
||||
this.expect(tt.jsxTagEnd);
|
||||
return this.finishNode(node, "JSXOpeningElement");
|
||||
}
|
||||
|
||||
// Parses JSX closing tag starting after "</".
|
||||
|
||||
jsxParseClosingElementAt(
|
||||
startPos: number,
|
||||
startLoc: Position,
|
||||
): N.JSXClosingElement {
|
||||
const node = this.startNodeAt(startPos, startLoc);
|
||||
if (this.match(tt.jsxTagEnd)) {
|
||||
this.expect(tt.jsxTagEnd);
|
||||
return this.finishNode(node, "JSXClosingFragment");
|
||||
}
|
||||
node.name = this.jsxParseElementName();
|
||||
this.expect(tt.jsxTagEnd);
|
||||
return this.finishNode(node, "JSXClosingElement");
|
||||
}
|
||||
|
||||
// Parses entire JSX element, including it"s opening tag
|
||||
// (starting after "<"), attributes, contents and closing tag.
|
||||
|
||||
jsxParseElementAt(startPos: number, startLoc: Position): N.JSXElement {
|
||||
const node = this.startNodeAt(startPos, startLoc);
|
||||
const children = [];
|
||||
const openingElement = this.jsxParseOpeningElementAt(startPos, startLoc);
|
||||
let closingElement = null;
|
||||
|
||||
if (!openingElement.selfClosing) {
|
||||
contents: for (;;) {
|
||||
switch (this.state.type) {
|
||||
case tt.jsxTagStart:
|
||||
startPos = this.state.start;
|
||||
startLoc = this.state.startLoc;
|
||||
this.next();
|
||||
if (this.eat(tt.slash)) {
|
||||
closingElement = this.jsxParseClosingElementAt(
|
||||
startPos,
|
||||
startLoc,
|
||||
);
|
||||
break contents;
|
||||
}
|
||||
children.push(this.jsxParseElementAt(startPos, startLoc));
|
||||
break;
|
||||
|
||||
case tt.jsxText:
|
||||
children.push(this.parseExprAtom());
|
||||
break;
|
||||
|
||||
case tt.braceL:
|
||||
if (this.lookahead().type === tt.ellipsis) {
|
||||
children.push(this.jsxParseSpreadChild());
|
||||
} else {
|
||||
children.push(this.jsxParseExpressionContainer());
|
||||
}
|
||||
|
||||
break;
|
||||
|
||||
// istanbul ignore next - should never happen
|
||||
default:
|
||||
throw this.unexpected();
|
||||
}
|
||||
}
|
||||
|
||||
if (isFragment(openingElement) && !isFragment(closingElement)) {
|
||||
this.raise(
|
||||
// $FlowIgnore
|
||||
closingElement.start,
|
||||
"Expected corresponding JSX closing tag for <>",
|
||||
);
|
||||
} else if (!isFragment(openingElement) && isFragment(closingElement)) {
|
||||
this.raise(
|
||||
// $FlowIgnore
|
||||
closingElement.start,
|
||||
"Expected corresponding JSX closing tag for <" +
|
||||
getQualifiedJSXName(openingElement.name) +
|
||||
">",
|
||||
);
|
||||
} else if (!isFragment(openingElement) && !isFragment(closingElement)) {
|
||||
if (
|
||||
// $FlowIgnore
|
||||
getQualifiedJSXName(closingElement.name) !==
|
||||
getQualifiedJSXName(openingElement.name)
|
||||
) {
|
||||
this.raise(
|
||||
// $FlowIgnore
|
||||
closingElement.start,
|
||||
"Expected corresponding JSX closing tag for <" +
|
||||
getQualifiedJSXName(openingElement.name) +
|
||||
">",
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (isFragment(openingElement)) {
|
||||
node.openingFragment = openingElement;
|
||||
node.closingFragment = closingElement;
|
||||
} else {
|
||||
node.openingElement = openingElement;
|
||||
node.closingElement = closingElement;
|
||||
}
|
||||
node.children = children;
|
||||
if (this.match(tt.relational) && this.state.value === "<") {
|
||||
this.raise(
|
||||
this.state.start,
|
||||
"Adjacent JSX elements must be wrapped in an enclosing tag. " +
|
||||
"Did you want a JSX fragment <>...</>?",
|
||||
);
|
||||
}
|
||||
|
||||
return isFragment(openingElement)
|
||||
? this.finishNode(node, "JSXFragment")
|
||||
: this.finishNode(node, "JSXElement");
|
||||
}
|
||||
|
||||
// Parses entire JSX element from current position.
|
||||
|
||||
jsxParseElement(): N.JSXElement {
|
||||
const startPos = this.state.start;
|
||||
const startLoc = this.state.startLoc;
|
||||
this.next();
|
||||
return this.jsxParseElementAt(startPos, startLoc);
|
||||
}
|
||||
|
||||
// ==================================
|
||||
// Overrides
|
||||
// ==================================
|
||||
|
||||
parseExprAtom(refShortHandDefaultPos: ?Pos): N.Expression {
|
||||
if (this.match(tt.jsxText)) {
|
||||
return this.parseLiteral(this.state.value, "JSXText");
|
||||
} else if (this.match(tt.jsxTagStart)) {
|
||||
return this.jsxParseElement();
|
||||
} else {
|
||||
return super.parseExprAtom(refShortHandDefaultPos);
|
||||
}
|
||||
}
|
||||
|
||||
readToken(code: number): void {
|
||||
if (this.state.inPropertyName) return super.readToken(code);
|
||||
|
||||
const context = this.curContext();
|
||||
|
||||
if (context === tc.j_expr) {
|
||||
return this.jsxReadToken();
|
||||
}
|
||||
|
||||
if (context === tc.j_oTag || context === tc.j_cTag) {
|
||||
if (isIdentifierStart(code)) {
|
||||
return this.jsxReadWord();
|
||||
}
|
||||
|
||||
if (code === charCodes.greaterThan) {
|
||||
++this.state.pos;
|
||||
return this.finishToken(tt.jsxTagEnd);
|
||||
}
|
||||
|
||||
if (
|
||||
(code === charCodes.quotationMark || code === charCodes.apostrophe) &&
|
||||
context === tc.j_oTag
|
||||
) {
|
||||
return this.jsxReadString(code);
|
||||
}
|
||||
}
|
||||
|
||||
if (code === charCodes.lessThan && this.state.exprAllowed) {
|
||||
++this.state.pos;
|
||||
return this.finishToken(tt.jsxTagStart);
|
||||
}
|
||||
|
||||
return super.readToken(code);
|
||||
}
|
||||
|
||||
updateContext(prevType: TokenType): void {
|
||||
if (this.match(tt.braceL)) {
|
||||
const curContext = this.curContext();
|
||||
if (curContext === tc.j_oTag) {
|
||||
this.state.context.push(tc.braceExpression);
|
||||
} else if (curContext === tc.j_expr) {
|
||||
this.state.context.push(tc.templateQuasi);
|
||||
} else {
|
||||
super.updateContext(prevType);
|
||||
}
|
||||
this.state.exprAllowed = true;
|
||||
} else if (this.match(tt.slash) && prevType === tt.jsxTagStart) {
|
||||
this.state.context.length -= 2; // do not consider JSX expr -> JSX open tag -> ... anymore
|
||||
this.state.context.push(tc.j_cTag); // reconsider as closing tag context
|
||||
this.state.exprAllowed = false;
|
||||
} else {
|
||||
return super.updateContext(prevType);
|
||||
}
|
||||
}
|
||||
};
|
||||
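A short sketch of the plugin in use via the public API; fragments and nested elements follow jsxParseElementAt() above (the input is chosen purely for illustration):

const { parse } = require("@babel/parser");

const ast = parse('<><Greeting name="Ada" />{items}</>', { plugins: ["jsx"] });
const fragment = ast.program.body[0].expression;
console.log(fragment.type);                      // "JSXFragment"
console.log(fragment.children.map(c => c.type)); // ["JSXElement", "JSXExpressionContainer"]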
258
packages/babel-parser/src/plugins/jsx/xhtml.js
Normal file
@@ -0,0 +1,258 @@
|
||||
// @flow
|
||||
|
||||
const entities: { [name: string]: string } = {
|
||||
quot: "\u0022",
|
||||
amp: "&",
|
||||
apos: "\u0027",
|
||||
lt: "<",
|
||||
gt: ">",
|
||||
nbsp: "\u00A0",
|
||||
iexcl: "\u00A1",
|
||||
cent: "\u00A2",
|
||||
pound: "\u00A3",
|
||||
curren: "\u00A4",
|
||||
yen: "\u00A5",
|
||||
brvbar: "\u00A6",
|
||||
sect: "\u00A7",
|
||||
uml: "\u00A8",
|
||||
copy: "\u00A9",
|
||||
ordf: "\u00AA",
|
||||
laquo: "\u00AB",
|
||||
not: "\u00AC",
|
||||
shy: "\u00AD",
|
||||
reg: "\u00AE",
|
||||
macr: "\u00AF",
|
||||
deg: "\u00B0",
|
||||
plusmn: "\u00B1",
|
||||
sup2: "\u00B2",
|
||||
sup3: "\u00B3",
|
||||
acute: "\u00B4",
|
||||
micro: "\u00B5",
|
||||
para: "\u00B6",
|
||||
middot: "\u00B7",
|
||||
cedil: "\u00B8",
|
||||
sup1: "\u00B9",
|
||||
ordm: "\u00BA",
|
||||
raquo: "\u00BB",
|
||||
frac14: "\u00BC",
|
||||
frac12: "\u00BD",
|
||||
frac34: "\u00BE",
|
||||
iquest: "\u00BF",
|
||||
Agrave: "\u00C0",
|
||||
Aacute: "\u00C1",
|
||||
Acirc: "\u00C2",
|
||||
Atilde: "\u00C3",
|
||||
Auml: "\u00C4",
|
||||
Aring: "\u00C5",
|
||||
AElig: "\u00C6",
|
||||
Ccedil: "\u00C7",
|
||||
Egrave: "\u00C8",
|
||||
Eacute: "\u00C9",
|
||||
Ecirc: "\u00CA",
|
||||
Euml: "\u00CB",
|
||||
Igrave: "\u00CC",
|
||||
Iacute: "\u00CD",
|
||||
Icirc: "\u00CE",
|
||||
Iuml: "\u00CF",
|
||||
ETH: "\u00D0",
|
||||
Ntilde: "\u00D1",
|
||||
Ograve: "\u00D2",
|
||||
Oacute: "\u00D3",
|
||||
Ocirc: "\u00D4",
|
||||
Otilde: "\u00D5",
|
||||
Ouml: "\u00D6",
|
||||
times: "\u00D7",
|
||||
Oslash: "\u00D8",
|
||||
Ugrave: "\u00D9",
|
||||
Uacute: "\u00DA",
|
||||
Ucirc: "\u00DB",
|
||||
Uuml: "\u00DC",
|
||||
Yacute: "\u00DD",
|
||||
THORN: "\u00DE",
|
||||
szlig: "\u00DF",
|
||||
agrave: "\u00E0",
|
||||
aacute: "\u00E1",
|
||||
acirc: "\u00E2",
|
||||
atilde: "\u00E3",
|
||||
auml: "\u00E4",
|
||||
aring: "\u00E5",
|
||||
aelig: "\u00E6",
|
||||
ccedil: "\u00E7",
|
||||
egrave: "\u00E8",
|
||||
eacute: "\u00E9",
|
||||
ecirc: "\u00EA",
|
||||
euml: "\u00EB",
|
||||
igrave: "\u00EC",
|
||||
iacute: "\u00ED",
|
||||
icirc: "\u00EE",
|
||||
iuml: "\u00EF",
|
||||
eth: "\u00F0",
|
||||
ntilde: "\u00F1",
|
||||
ograve: "\u00F2",
|
||||
oacute: "\u00F3",
|
||||
ocirc: "\u00F4",
|
||||
otilde: "\u00F5",
|
||||
ouml: "\u00F6",
|
||||
divide: "\u00F7",
|
||||
oslash: "\u00F8",
|
||||
ugrave: "\u00F9",
|
||||
uacute: "\u00FA",
|
||||
ucirc: "\u00FB",
|
||||
uuml: "\u00FC",
|
||||
yacute: "\u00FD",
|
||||
thorn: "\u00FE",
|
||||
yuml: "\u00FF",
|
||||
OElig: "\u0152",
|
||||
oelig: "\u0153",
|
||||
Scaron: "\u0160",
|
||||
scaron: "\u0161",
|
||||
Yuml: "\u0178",
|
||||
fnof: "\u0192",
|
||||
circ: "\u02C6",
|
||||
tilde: "\u02DC",
|
||||
Alpha: "\u0391",
|
||||
Beta: "\u0392",
|
||||
Gamma: "\u0393",
|
||||
Delta: "\u0394",
|
||||
Epsilon: "\u0395",
|
||||
Zeta: "\u0396",
|
||||
Eta: "\u0397",
|
||||
Theta: "\u0398",
|
||||
Iota: "\u0399",
|
||||
Kappa: "\u039A",
|
||||
Lambda: "\u039B",
|
||||
Mu: "\u039C",
|
||||
Nu: "\u039D",
|
||||
Xi: "\u039E",
|
||||
Omicron: "\u039F",
|
||||
Pi: "\u03A0",
|
||||
Rho: "\u03A1",
|
||||
Sigma: "\u03A3",
|
||||
Tau: "\u03A4",
|
||||
Upsilon: "\u03A5",
|
||||
Phi: "\u03A6",
|
||||
Chi: "\u03A7",
|
||||
Psi: "\u03A8",
|
||||
Omega: "\u03A9",
|
||||
alpha: "\u03B1",
|
||||
beta: "\u03B2",
|
||||
gamma: "\u03B3",
|
||||
delta: "\u03B4",
|
||||
epsilon: "\u03B5",
|
||||
zeta: "\u03B6",
|
||||
eta: "\u03B7",
|
||||
theta: "\u03B8",
|
||||
iota: "\u03B9",
|
||||
kappa: "\u03BA",
|
||||
lambda: "\u03BB",
|
||||
mu: "\u03BC",
|
||||
nu: "\u03BD",
|
||||
xi: "\u03BE",
|
||||
omicron: "\u03BF",
|
||||
pi: "\u03C0",
|
||||
rho: "\u03C1",
|
||||
sigmaf: "\u03C2",
|
||||
sigma: "\u03C3",
|
||||
tau: "\u03C4",
|
||||
upsilon: "\u03C5",
|
||||
phi: "\u03C6",
|
||||
chi: "\u03C7",
|
||||
psi: "\u03C8",
|
||||
omega: "\u03C9",
|
||||
thetasym: "\u03D1",
|
||||
upsih: "\u03D2",
|
||||
piv: "\u03D6",
|
||||
ensp: "\u2002",
|
||||
emsp: "\u2003",
|
||||
thinsp: "\u2009",
|
||||
zwnj: "\u200C",
|
||||
zwj: "\u200D",
|
||||
lrm: "\u200E",
|
||||
rlm: "\u200F",
|
||||
ndash: "\u2013",
|
||||
mdash: "\u2014",
|
||||
lsquo: "\u2018",
|
||||
rsquo: "\u2019",
|
||||
sbquo: "\u201A",
|
||||
ldquo: "\u201C",
|
||||
rdquo: "\u201D",
|
||||
bdquo: "\u201E",
|
||||
dagger: "\u2020",
|
||||
Dagger: "\u2021",
|
||||
bull: "\u2022",
|
||||
hellip: "\u2026",
|
||||
permil: "\u2030",
|
||||
prime: "\u2032",
|
||||
Prime: "\u2033",
|
||||
lsaquo: "\u2039",
|
||||
rsaquo: "\u203A",
|
||||
oline: "\u203E",
|
||||
frasl: "\u2044",
|
||||
euro: "\u20AC",
|
||||
image: "\u2111",
|
||||
weierp: "\u2118",
|
||||
real: "\u211C",
|
||||
trade: "\u2122",
|
||||
alefsym: "\u2135",
|
||||
larr: "\u2190",
|
||||
uarr: "\u2191",
|
||||
rarr: "\u2192",
|
||||
darr: "\u2193",
|
||||
harr: "\u2194",
|
||||
crarr: "\u21B5",
|
||||
lArr: "\u21D0",
|
||||
uArr: "\u21D1",
|
||||
rArr: "\u21D2",
|
||||
dArr: "\u21D3",
|
||||
hArr: "\u21D4",
|
||||
forall: "\u2200",
|
||||
part: "\u2202",
|
||||
exist: "\u2203",
|
||||
empty: "\u2205",
|
||||
nabla: "\u2207",
|
||||
isin: "\u2208",
|
||||
notin: "\u2209",
|
||||
ni: "\u220B",
|
||||
prod: "\u220F",
|
||||
sum: "\u2211",
|
||||
minus: "\u2212",
|
||||
lowast: "\u2217",
|
||||
radic: "\u221A",
|
||||
prop: "\u221D",
|
||||
infin: "\u221E",
|
||||
ang: "\u2220",
|
||||
and: "\u2227",
|
||||
or: "\u2228",
|
||||
cap: "\u2229",
|
||||
cup: "\u222A",
|
||||
int: "\u222B",
|
||||
there4: "\u2234",
|
||||
sim: "\u223C",
|
||||
cong: "\u2245",
|
||||
asymp: "\u2248",
|
||||
ne: "\u2260",
|
||||
equiv: "\u2261",
|
||||
le: "\u2264",
|
||||
ge: "\u2265",
|
||||
sub: "\u2282",
|
||||
sup: "\u2283",
|
||||
nsub: "\u2284",
|
||||
sube: "\u2286",
|
||||
supe: "\u2287",
|
||||
oplus: "\u2295",
|
||||
otimes: "\u2297",
|
||||
perp: "\u22A5",
|
||||
sdot: "\u22C5",
|
||||
lceil: "\u2308",
|
||||
rceil: "\u2309",
|
||||
lfloor: "\u230A",
|
||||
rfloor: "\u230B",
|
||||
lang: "\u2329",
|
||||
rang: "\u232A",
|
||||
loz: "\u25CA",
|
||||
spades: "\u2660",
|
||||
clubs: "\u2663",
|
||||
hearts: "\u2665",
|
||||
diams: "\u2666",
|
||||
};
|
||||
export default entities;
|
||||
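This table backs jsxReadEntity() in the JSX plugin: named and numeric character references in JSX text and attribute strings are decoded into the token value. An illustrative check through the public API (assuming the jsx plugin, as above):

const { parse } = require("@babel/parser");

const ast = parse('<p title="Caf&eacute;">1 &lt; 2 &#x2764;</p>', { plugins: ["jsx"] });
const el = ast.program.body[0].expression;
console.log(el.openingElement.attributes[0].value.value); // "Café"
console.log(el.children[0].value);                        // "1 < 2 ❤"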
2099
packages/babel-parser/src/plugins/typescript.js
Normal file
File diff suppressed because it is too large
124
packages/babel-parser/src/tokenizer/context.js
Normal file
@@ -0,0 +1,124 @@
// @flow

// The algorithm used to determine whether a regexp can appear at a
// given point in the program is loosely based on sweet.js' approach.
// See https://github.com/mozilla/sweet.js/wiki/design

import { types as tt } from "./types";
import { lineBreak } from "../util/whitespace";

export class TokContext {
  constructor(
    token: string,
    isExpr?: boolean,
    preserveSpace?: boolean,
    override?: Function, // Takes a Tokenizer as a this-parameter, and returns void.
  ) {
    this.token = token;
    this.isExpr = !!isExpr;
    this.preserveSpace = !!preserveSpace;
    this.override = override;
  }

  token: string;
  isExpr: boolean;
  preserveSpace: boolean;
  override: ?Function;
}

export const types: {
  [key: string]: TokContext,
} = {
  braceStatement: new TokContext("{", false),
  braceExpression: new TokContext("{", true),
  templateQuasi: new TokContext("${", true),
  parenStatement: new TokContext("(", false),
  parenExpression: new TokContext("(", true),
  template: new TokContext("`", true, true, p => p.readTmplToken()),
  functionExpression: new TokContext("function", true),
};

// Token-specific context update code

tt.parenR.updateContext = tt.braceR.updateContext = function() {
  if (this.state.context.length === 1) {
    this.state.exprAllowed = true;
    return;
  }

  const out = this.state.context.pop();
  if (
    out === types.braceStatement &&
    this.curContext() === types.functionExpression
  ) {
    this.state.context.pop();
    this.state.exprAllowed = false;
  } else if (out === types.templateQuasi) {
    this.state.exprAllowed = true;
  } else {
    this.state.exprAllowed = !out.isExpr;
  }
};

tt.name.updateContext = function(prevType) {
  if (this.state.value === "of" && this.curContext() === types.parenStatement) {
    this.state.exprAllowed = !prevType.beforeExpr;
    return;
  }

  this.state.exprAllowed = false;

  if (prevType === tt._let || prevType === tt._const || prevType === tt._var) {
    if (lineBreak.test(this.input.slice(this.state.end))) {
      this.state.exprAllowed = true;
    }
  }
  if (this.state.isIterator) {
    this.state.isIterator = false;
  }
};

tt.braceL.updateContext = function(prevType) {
  this.state.context.push(
    this.braceIsBlock(prevType) ? types.braceStatement : types.braceExpression,
  );
  this.state.exprAllowed = true;
};

tt.dollarBraceL.updateContext = function() {
  this.state.context.push(types.templateQuasi);
  this.state.exprAllowed = true;
};

tt.parenL.updateContext = function(prevType) {
  const statementParens =
    prevType === tt._if ||
    prevType === tt._for ||
    prevType === tt._with ||
    prevType === tt._while;
  this.state.context.push(
    statementParens ? types.parenStatement : types.parenExpression,
  );
  this.state.exprAllowed = true;
};

tt.incDec.updateContext = function() {
  // tokExprAllowed stays unchanged
};

tt._function.updateContext = function(prevType) {
  if (this.state.exprAllowed && !this.braceIsBlock(prevType)) {
    this.state.context.push(types.functionExpression);
  }

  this.state.exprAllowed = false;
};

tt.backQuote.updateContext = function() {
  if (this.curContext() === types.template) {
    this.state.context.pop();
  } else {
    this.state.context.push(types.template);
  }
  this.state.exprAllowed = false;
};
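The point of this context stack is deciding whether a "/" begins a regular expression or is the division operator. An illustrative check using the tokens: true option of the public API (option name assumed as in the published parser):

const { parse } = require("@babel/parser");

// After the ")" that closes an `if (...)` head the parser is back in statement
// position, so "/re/" is a regular expression; after an identifier it is division.
const asRegex = parse("if (ok) /re/.test(s);", { tokens: true });
const asDivision = parse("const x = a / b / c;", { tokens: true });

console.log(asRegex.tokens.some(t => t.type.label === "regexp"));    // true
console.log(asDivision.tokens.some(t => t.type.label === "regexp")); // false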
1367
packages/babel-parser/src/tokenizer/index.js
Normal file
File diff suppressed because it is too large
213
packages/babel-parser/src/tokenizer/state.js
Normal file
@@ -0,0 +1,213 @@
|
||||
// @flow
|
||||
|
||||
import type { Options } from "../options";
|
||||
import * as N from "../types";
|
||||
import { Position } from "../util/location";
|
||||
|
||||
import { types as ct, type TokContext } from "./context";
|
||||
import type { Token } from "./index";
|
||||
import { types as tt, type TokenType } from "./types";
|
||||
|
||||
export default class State {
|
||||
init(options: Options, input: string): void {
|
||||
this.strict =
|
||||
options.strictMode === false ? false : options.sourceType === "module";
|
||||
|
||||
this.input = input;
|
||||
|
||||
this.potentialArrowAt = -1;
|
||||
|
||||
this.noArrowAt = [];
|
||||
this.noArrowParamsConversionAt = [];
|
||||
|
||||
this.inMethod = false;
|
||||
this.inFunction = false;
|
||||
this.inParameters = false;
|
||||
this.maybeInArrowParameters = false;
|
||||
this.inGenerator = false;
|
||||
this.inAsync = false;
|
||||
this.inPropertyName = false;
|
||||
this.inType = false;
|
||||
this.inClassProperty = false;
|
||||
this.noAnonFunctionType = false;
|
||||
this.hasFlowComment = false;
|
||||
this.isIterator = false;
|
||||
|
||||
this.classLevel = 0;
|
||||
|
||||
this.labels = [];
|
||||
|
||||
this.decoratorStack = [[]];
|
||||
|
||||
this.yieldInPossibleArrowParameters = null;
|
||||
|
||||
this.tokens = [];
|
||||
|
||||
this.comments = [];
|
||||
|
||||
this.trailingComments = [];
|
||||
this.leadingComments = [];
|
||||
this.commentStack = [];
|
||||
// $FlowIgnore
|
||||
this.commentPreviousNode = null;
|
||||
|
||||
this.pos = this.lineStart = 0;
|
||||
this.curLine = options.startLine;
|
||||
|
||||
this.type = tt.eof;
|
||||
this.value = null;
|
||||
this.start = this.end = this.pos;
|
||||
this.startLoc = this.endLoc = this.curPosition();
|
||||
|
||||
// $FlowIgnore
|
||||
this.lastTokEndLoc = this.lastTokStartLoc = null;
|
||||
this.lastTokStart = this.lastTokEnd = this.pos;
|
||||
|
||||
this.context = [ct.braceStatement];
|
||||
this.exprAllowed = true;
|
||||
|
||||
this.containsEsc = this.containsOctal = false;
|
||||
this.octalPosition = null;
|
||||
|
||||
this.invalidTemplateEscapePosition = null;
|
||||
|
||||
this.exportedIdentifiers = [];
|
||||
}
|
||||
|
||||
// TODO
|
||||
strict: boolean;
|
||||
|
||||
// TODO
|
||||
input: string;
|
||||
|
||||
// Used to signify the start of a potential arrow function
|
||||
potentialArrowAt: number;
|
||||
|
||||
// Used to signify the start of an expression which looks like a
|
||||
// typed arrow function, but it isn't
|
||||
// e.g. a ? (b) : c => d
|
||||
// ^
|
||||
noArrowAt: number[];
|
||||
|
||||
// Used to signify the start of an expression whose params, if it looks like
|
||||
// an arrow function, shouldn't be converted to assignable nodes.
|
||||
// This is used to defer the validation of typed arrow functions inside
|
||||
// conditional expressions.
|
||||
// e.g. a ? (b) : c => d
|
||||
// ^
|
||||
noArrowParamsConversionAt: number[];
|
||||
|
||||
// Flags to track whether we are in a function, a generator.
|
||||
inFunction: boolean;
|
||||
inParameters: boolean;
|
||||
maybeInArrowParameters: boolean;
|
||||
inGenerator: boolean;
|
||||
inMethod: boolean | N.MethodKind;
|
||||
inAsync: boolean;
|
||||
inType: boolean;
|
||||
noAnonFunctionType: boolean;
|
||||
inPropertyName: boolean;
|
||||
inClassProperty: boolean;
|
||||
hasFlowComment: boolean;
|
||||
isIterator: boolean;
|
||||
|
||||
// Check whether we are in a (nested) class or not.
|
||||
classLevel: number;
|
||||
|
||||
// Labels in scope.
|
||||
labels: Array<{ kind: ?("loop" | "switch"), statementStart?: number }>;
|
||||
|
||||
// Leading decorators. Last element of the stack represents the decorators in current context.
|
||||
// Supports nesting of decorators, e.g. @foo(@bar class inner {}) class outer {}
|
||||
// where @foo belongs to the outer class and @bar to the inner
|
||||
decoratorStack: Array<Array<N.Decorator>>;
|
||||
|
||||
// The first yield expression inside parenthesized expressions and arrow
|
||||
// function parameters. It is used to disallow yield in arrow function
|
||||
// parameters.
|
||||
yieldInPossibleArrowParameters: ?N.YieldExpression;
|
||||
|
||||
// Token store.
|
||||
tokens: Array<Token | N.Comment>;
|
||||
|
||||
// Comment store.
|
||||
comments: Array<N.Comment>;
|
||||
|
||||
// Comment attachment store
|
||||
trailingComments: Array<N.Comment>;
|
||||
leadingComments: Array<N.Comment>;
|
||||
commentStack: Array<{
|
||||
start: number,
|
||||
leadingComments: ?Array<N.Comment>,
|
||||
trailingComments: ?Array<N.Comment>,
|
||||
}>;
|
||||
commentPreviousNode: N.Node;
|
||||
|
||||
// The current position of the tokenizer in the input.
|
||||
pos: number;
|
||||
lineStart: number;
|
||||
curLine: number;
|
||||
|
||||
// Properties of the current token:
|
||||
// Its type
|
||||
type: TokenType;
|
||||
|
||||
// For tokens that include more information than their type, the value
|
||||
value: any;
|
||||
|
||||
// Its start and end offset
|
||||
start: number;
|
||||
end: number;
|
||||
|
||||
// And, if locations are used, the {line, column} object
|
||||
// corresponding to those offsets
|
||||
startLoc: Position;
|
||||
endLoc: Position;
|
||||
|
||||
// Position information for the previous token
|
||||
lastTokEndLoc: Position;
|
||||
lastTokStartLoc: Position;
|
||||
lastTokStart: number;
|
||||
lastTokEnd: number;
|
||||
|
||||
// The context stack is used to superficially track syntactic
|
||||
// context to predict whether a regular expression is allowed in a
|
||||
// given position.
|
||||
context: Array<TokContext>;
|
||||
exprAllowed: boolean;
|
||||
|
||||
// Used to signal to callers of `readWord1` whether the word
|
||||
// contained any escape sequences. This is needed because words with
|
||||
// escape sequences must not be interpreted as keywords.
|
||||
containsEsc: boolean;
|
||||
|
||||
// TODO
|
||||
containsOctal: boolean;
|
||||
octalPosition: ?number;
|
||||
|
||||
// Names of exports store. `default` is stored as a name for both
|
||||
// `export default foo;` and `export { foo as default };`.
|
||||
exportedIdentifiers: Array<string>;
|
||||
|
||||
invalidTemplateEscapePosition: ?number;
|
||||
|
||||
curPosition(): Position {
|
||||
return new Position(this.curLine, this.pos - this.lineStart);
|
||||
}
|
||||
|
||||
clone(skipArrays?: boolean): State {
|
||||
const state = new State();
|
||||
Object.keys(this).forEach(key => {
|
||||
// $FlowIgnore
|
||||
let val = this[key];
|
||||
|
||||
if ((!skipArrays || key === "context") && Array.isArray(val)) {
|
||||
val = val.slice();
|
||||
}
|
||||
|
||||
// $FlowIgnore
|
||||
state[key] = val;
|
||||
});
|
||||
return state;
|
||||
}
|
||||
}
|
||||
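clone() exists mainly so the tokenizer can look ahead without committing: copy the state, read one token from the copy, then restore the original. The real lookahead() lives in the suppressed tokenizer/index.js; the following is only a hedged sketch of that pattern, not the actual implementation:

// Hypothetical sketch of a lookahead built on State#clone(); names other than
// clone()/next() are illustrative.
lookahead(): State {
  const old = this.state;
  this.state = old.clone(true); // skipArrays=true still copies "context"
  this.next();                  // tokenize one token into the cloned state
  const lookaheadState = this.state;
  this.state = old;             // roll back; the real cursor never moved
  return lookaheadState;
}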
198
packages/babel-parser/src/tokenizer/types.js
Normal file
@@ -0,0 +1,198 @@
|
||||
// @flow
|
||||
|
||||
// ## Token types
|
||||
|
||||
// The assignment of fine-grained, information-carrying type objects
|
||||
// allows the tokenizer to store the information it has about a
|
||||
// token in a way that is very cheap for the parser to look up.
|
||||
|
||||
// All token type variables start with an underscore, to make them
|
||||
// easy to recognize.
|
||||
|
||||
// The `beforeExpr` property is used to disambiguate between regular
|
||||
// expressions and divisions. It is set on all token types that can
|
||||
// be followed by an expression (thus, a slash after them would be a
|
||||
// regular expression).
|
||||
//
|
||||
// `isLoop` marks a keyword as starting a loop, which is important
|
||||
// to know when parsing a label, in order to allow or disallow
|
||||
// continue jumps to that label.
|
||||
|
||||
const beforeExpr = true;
|
||||
const startsExpr = true;
|
||||
const isLoop = true;
|
||||
const isAssign = true;
|
||||
const prefix = true;
|
||||
const postfix = true;
|
||||
|
||||
type TokenOptions = {
|
||||
keyword?: string,
|
||||
|
||||
beforeExpr?: boolean,
|
||||
startsExpr?: boolean,
|
||||
rightAssociative?: boolean,
|
||||
isLoop?: boolean,
|
||||
isAssign?: boolean,
|
||||
prefix?: boolean,
|
||||
postfix?: boolean,
|
||||
binop?: ?number,
|
||||
};
|
||||
|
||||
export class TokenType {
|
||||
label: string;
|
||||
keyword: ?string;
|
||||
beforeExpr: boolean;
|
||||
startsExpr: boolean;
|
||||
rightAssociative: boolean;
|
||||
isLoop: boolean;
|
||||
isAssign: boolean;
|
||||
prefix: boolean;
|
||||
postfix: boolean;
|
||||
binop: ?number;
|
||||
updateContext: ?(prevType: TokenType) => void;
|
||||
|
||||
constructor(label: string, conf: TokenOptions = {}) {
|
||||
this.label = label;
|
||||
this.keyword = conf.keyword;
|
||||
this.beforeExpr = !!conf.beforeExpr;
|
||||
this.startsExpr = !!conf.startsExpr;
|
||||
this.rightAssociative = !!conf.rightAssociative;
|
||||
this.isLoop = !!conf.isLoop;
|
||||
this.isAssign = !!conf.isAssign;
|
||||
this.prefix = !!conf.prefix;
|
||||
this.postfix = !!conf.postfix;
|
||||
this.binop = conf.binop === 0 ? 0 : conf.binop || null;
|
||||
this.updateContext = null;
|
||||
}
|
||||
}
|
||||
|
||||
class KeywordTokenType extends TokenType {
|
||||
constructor(name: string, options: TokenOptions = {}) {
|
||||
options.keyword = name;
|
||||
|
||||
super(name, options);
|
||||
}
|
||||
}
|
||||
|
||||
export class BinopTokenType extends TokenType {
|
||||
constructor(name: string, prec: number) {
|
||||
super(name, { beforeExpr, binop: prec });
|
||||
}
|
||||
}
|
||||
|
||||
export const types: { [name: string]: TokenType } = {
|
||||
num: new TokenType("num", { startsExpr }),
|
||||
bigint: new TokenType("bigint", { startsExpr }),
|
||||
regexp: new TokenType("regexp", { startsExpr }),
|
||||
string: new TokenType("string", { startsExpr }),
|
||||
name: new TokenType("name", { startsExpr }),
|
||||
eof: new TokenType("eof"),
|
||||
|
||||
// Punctuation token types.
|
||||
bracketL: new TokenType("[", { beforeExpr, startsExpr }),
|
||||
bracketR: new TokenType("]"),
|
||||
braceL: new TokenType("{", { beforeExpr, startsExpr }),
|
||||
braceBarL: new TokenType("{|", { beforeExpr, startsExpr }),
|
||||
braceR: new TokenType("}"),
|
||||
braceBarR: new TokenType("|}"),
|
||||
parenL: new TokenType("(", { beforeExpr, startsExpr }),
|
||||
parenR: new TokenType(")"),
|
||||
comma: new TokenType(",", { beforeExpr }),
|
||||
semi: new TokenType(";", { beforeExpr }),
|
||||
colon: new TokenType(":", { beforeExpr }),
|
||||
doubleColon: new TokenType("::", { beforeExpr }),
|
||||
dot: new TokenType("."),
|
||||
question: new TokenType("?", { beforeExpr }),
|
||||
questionDot: new TokenType("?."),
|
||||
arrow: new TokenType("=>", { beforeExpr }),
|
||||
template: new TokenType("template"),
|
||||
ellipsis: new TokenType("...", { beforeExpr }),
|
||||
backQuote: new TokenType("`", { startsExpr }),
|
||||
dollarBraceL: new TokenType("${", { beforeExpr, startsExpr }),
|
||||
at: new TokenType("@"),
|
||||
hash: new TokenType("#"),
|
||||
|
||||
// Operators. These carry several kinds of properties to help the
|
||||
// parser use them properly (the presence of these properties is
|
||||
// what categorizes them as operators).
|
||||
//
|
||||
// `binop`, when present, specifies that this operator is a binary
|
||||
// operator, and will refer to its precedence.
|
||||
//
|
||||
// `prefix` and `postfix` mark the operator as a prefix or postfix
|
||||
// unary operator.
|
||||
//
|
||||
// `isAssign` marks all of `=`, `+=`, `-=` etcetera, which act as
|
||||
// binary operators with a very low precedence, that should result
|
||||
// in AssignmentExpression nodes.
|
||||
|
||||
eq: new TokenType("=", { beforeExpr, isAssign }),
|
||||
assign: new TokenType("_=", { beforeExpr, isAssign }),
|
||||
incDec: new TokenType("++/--", { prefix, postfix, startsExpr }),
|
||||
bang: new TokenType("!", { beforeExpr, prefix, startsExpr }),
|
||||
tilde: new TokenType("~", { beforeExpr, prefix, startsExpr }),
|
||||
pipeline: new BinopTokenType("|>", 0),
|
||||
nullishCoalescing: new BinopTokenType("??", 1),
|
||||
logicalOR: new BinopTokenType("||", 1),
|
||||
logicalAND: new BinopTokenType("&&", 2),
|
||||
bitwiseOR: new BinopTokenType("|", 3),
|
||||
bitwiseXOR: new BinopTokenType("^", 4),
|
||||
bitwiseAND: new BinopTokenType("&", 5),
|
||||
equality: new BinopTokenType("==/!=", 6),
|
||||
relational: new BinopTokenType("</>", 7),
|
||||
bitShift: new BinopTokenType("<</>>", 8),
|
||||
plusMin: new TokenType("+/-", { beforeExpr, binop: 9, prefix, startsExpr }),
|
||||
modulo: new BinopTokenType("%", 10),
|
||||
star: new BinopTokenType("*", 10),
|
||||
slash: new BinopTokenType("/", 10),
|
||||
exponent: new TokenType("**", {
|
||||
beforeExpr,
|
||||
binop: 11,
|
||||
rightAssociative: true,
|
||||
}),
|
||||
};
|
||||
|
||||
export const keywords = {
|
||||
break: new KeywordTokenType("break"),
|
||||
case: new KeywordTokenType("case", { beforeExpr }),
|
||||
catch: new KeywordTokenType("catch"),
|
||||
continue: new KeywordTokenType("continue"),
|
||||
debugger: new KeywordTokenType("debugger"),
|
||||
default: new KeywordTokenType("default", { beforeExpr }),
|
||||
do: new KeywordTokenType("do", { isLoop, beforeExpr }),
|
||||
else: new KeywordTokenType("else", { beforeExpr }),
|
||||
finally: new KeywordTokenType("finally"),
|
||||
for: new KeywordTokenType("for", { isLoop }),
|
||||
function: new KeywordTokenType("function", { startsExpr }),
|
||||
if: new KeywordTokenType("if"),
|
||||
return: new KeywordTokenType("return", { beforeExpr }),
|
||||
switch: new KeywordTokenType("switch"),
|
||||
throw: new KeywordTokenType("throw", { beforeExpr, prefix, startsExpr }),
|
||||
try: new KeywordTokenType("try"),
|
||||
var: new KeywordTokenType("var"),
|
||||
let: new KeywordTokenType("let"),
|
||||
const: new KeywordTokenType("const"),
|
||||
while: new KeywordTokenType("while", { isLoop }),
|
||||
with: new KeywordTokenType("with"),
|
||||
new: new KeywordTokenType("new", { beforeExpr, startsExpr }),
|
||||
this: new KeywordTokenType("this", { startsExpr }),
|
||||
super: new KeywordTokenType("super", { startsExpr }),
|
||||
class: new KeywordTokenType("class"),
|
||||
extends: new KeywordTokenType("extends", { beforeExpr }),
|
||||
export: new KeywordTokenType("export"),
|
||||
import: new KeywordTokenType("import", { startsExpr }),
|
||||
yield: new KeywordTokenType("yield", { beforeExpr, startsExpr }),
|
||||
null: new KeywordTokenType("null", { startsExpr }),
|
||||
true: new KeywordTokenType("true", { startsExpr }),
|
||||
false: new KeywordTokenType("false", { startsExpr }),
|
||||
in: new KeywordTokenType("in", { beforeExpr, binop: 7 }),
|
||||
instanceof: new KeywordTokenType("instanceof", { beforeExpr, binop: 7 }),
|
||||
typeof: new KeywordTokenType("typeof", { beforeExpr, prefix, startsExpr }),
|
||||
void: new KeywordTokenType("void", { beforeExpr, prefix, startsExpr }),
|
||||
delete: new KeywordTokenType("delete", { beforeExpr, prefix, startsExpr }),
|
||||
};
|
||||
|
||||
// Map keyword names to token types.
|
||||
Object.keys(keywords).forEach(name => {
|
||||
types["_" + name] = keywords[name];
|
||||
});
|
||||
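The underscore aliases created by the loop above and the binop/rightAssociative flags are what the expression parser consults for keywords and operator precedence. A small illustrative check of the tables defined in this file:

import { types as tt, keywords } from "./types";

// Every keyword is reachable under an underscore alias after the mapping above.
console.log(tt._function === keywords.function); // true

// Binary-operator precedence and associativity live on the token type itself.
console.log(tt.logicalOR.binop);           // 1
console.log(tt.relational.binop);          // 7
console.log(tt.star.binop);                // 10
console.log(tt.exponent.rightAssociative); // true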
1343
packages/babel-parser/src/types.js
Normal file
File diff suppressed because it is too large
112
packages/babel-parser/src/util/identifier.js
Normal file
@@ -0,0 +1,112 @@
|
||||
/* eslint max-len: 0 */
|
||||
|
||||
// @flow
|
||||
|
||||
function makePredicate(words: string): (str: string) => boolean {
|
||||
const wordsArr = words.split(" ");
|
||||
return function(str) {
|
||||
return wordsArr.indexOf(str) >= 0;
|
||||
};
|
||||
}
|
||||
|
||||
// Reserved word lists for various dialects of the language
|
||||
|
||||
export const reservedWords = {
|
||||
"6": makePredicate("enum await"),
|
||||
strict: makePredicate(
|
||||
"implements interface let package private protected public static yield",
|
||||
),
|
||||
strictBind: makePredicate("eval arguments"),
|
||||
};
|
||||
|
||||
// And the keywords
|
||||
|
||||
export const isKeyword = makePredicate(
|
||||
"break case catch continue debugger default do else finally for function if return switch throw try var while with null true false instanceof typeof void delete new in this let const class extends export import yield super",
|
||||
);
|
||||
|
||||
// ## Character categories
|
||||
|
||||
// Big ugly regular expressions that match characters in the
|
||||
// whitespace, identifier, and identifier-start categories. These
|
||||
// are only applied when a character is found to actually have a
|
||||
// code point above 128.
|
||||
// Generated by `bin/generate-identifier-regex.js`.
|
||||
|
||||
/* prettier-ignore */
|
||||
let nonASCIIidentifierStartChars = "\xaa\xb5\xba\xc0-\xd6\xd8-\xf6\xf8-\u02c1\u02c6-\u02d1\u02e0-\u02e4\u02ec\u02ee\u0370-\u0374\u0376\u0377\u037a-\u037d\u037f\u0386\u0388-\u038a\u038c\u038e-\u03a1\u03a3-\u03f5\u03f7-\u0481\u048a-\u052f\u0531-\u0556\u0559\u0561-\u0587\u05d0-\u05ea\u05f0-\u05f2\u0620-\u064a\u066e\u066f\u0671-\u06d3\u06d5\u06e5\u06e6\u06ee\u06ef\u06fa-\u06fc\u06ff\u0710\u0712-\u072f\u074d-\u07a5\u07b1\u07ca-\u07ea\u07f4\u07f5\u07fa\u0800-\u0815\u081a\u0824\u0828\u0840-\u0858\u0860-\u086a\u08a0-\u08b4\u08b6-\u08bd\u0904-\u0939\u093d\u0950\u0958-\u0961\u0971-\u0980\u0985-\u098c\u098f\u0990\u0993-\u09a8\u09aa-\u09b0\u09b2\u09b6-\u09b9\u09bd\u09ce\u09dc\u09dd\u09df-\u09e1\u09f0\u09f1\u09fc\u0a05-\u0a0a\u0a0f\u0a10\u0a13-\u0a28\u0a2a-\u0a30\u0a32\u0a33\u0a35\u0a36\u0a38\u0a39\u0a59-\u0a5c\u0a5e\u0a72-\u0a74\u0a85-\u0a8d\u0a8f-\u0a91\u0a93-\u0aa8\u0aaa-\u0ab0\u0ab2\u0ab3\u0ab5-\u0ab9\u0abd\u0ad0\u0ae0\u0ae1\u0af9\u0b05-\u0b0c\u0b0f\u0b10\u0b13-\u0b28\u0b2a-\u0b30\u0b32\u0b33\u0b35-\u0b39\u0b3d\u0b5c\u0b5d\u0b5f-\u0b61\u0b71\u0b83\u0b85-\u0b8a\u0b8e-\u0b90\u0b92-\u0b95\u0b99\u0b9a\u0b9c\u0b9e\u0b9f\u0ba3\u0ba4\u0ba8-\u0baa\u0bae-\u0bb9\u0bd0\u0c05-\u0c0c\u0c0e-\u0c10\u0c12-\u0c28\u0c2a-\u0c39\u0c3d\u0c58-\u0c5a\u0c60\u0c61\u0c80\u0c85-\u0c8c\u0c8e-\u0c90\u0c92-\u0ca8\u0caa-\u0cb3\u0cb5-\u0cb9\u0cbd\u0cde\u0ce0\u0ce1\u0cf1\u0cf2\u0d05-\u0d0c\u0d0e-\u0d10\u0d12-\u0d3a\u0d3d\u0d4e\u0d54-\u0d56\u0d5f-\u0d61\u0d7a-\u0d7f\u0d85-\u0d96\u0d9a-\u0db1\u0db3-\u0dbb\u0dbd\u0dc0-\u0dc6\u0e01-\u0e30\u0e32\u0e33\u0e40-\u0e46\u0e81\u0e82\u0e84\u0e87\u0e88\u0e8a\u0e8d\u0e94-\u0e97\u0e99-\u0e9f\u0ea1-\u0ea3\u0ea5\u0ea7\u0eaa\u0eab\u0ead-\u0eb0\u0eb2\u0eb3\u0ebd\u0ec0-\u0ec4\u0ec6\u0edc-\u0edf\u0f00\u0f40-\u0f47\u0f49-\u0f6c\u0f88-\u0f8c\u1000-\u102a\u103f\u1050-\u1055\u105a-\u105d\u1061\u1065\u1066\u106e-\u1070\u1075-\u1081\u108e\u10a0-\u10c5\u10c7\u10cd\u10d0-\u10fa\u10fc-\u1248\u124a-\u124d\u1250-\u1256\u1258\u125a-\u125d\u1260-\u1288\u128a-\u128d\u1290-\u12b0\u12b2-\u12b5\u12b8-\u12be\u12c0\u12c2-\u12c5\u12c8-\u12d6\u12d8-\u1310\u1312-\u1315\u1318-\u135a\u1380-\u138f\u13a0-\u13f5\u13f8-\u13fd\u1401-\u166c\u166f-\u167f\u1681-\u169a\u16a0-\u16ea\u16ee-\u16f8\u1700-\u170c\u170e-\u1711\u1720-\u1731\u1740-\u1751\u1760-\u176c\u176e-\u1770\u1780-\u17b3\u17d7\u17dc\u1820-\u1877\u1880-\u18a8\u18aa\u18b0-\u18f5\u1900-\u191e\u1950-\u196d\u1970-\u1974\u1980-\u19ab\u19b0-\u19c9\u1a00-\u1a16\u1a20-\u1a54\u1aa7\u1b05-\u1b33\u1b45-\u1b4b\u1b83-\u1ba0\u1bae\u1baf\u1bba-\u1be5\u1c00-\u1c23\u1c4d-\u1c4f\u1c5a-\u1c7d\u1c80-\u1c88\u1ce9-\u1cec\u1cee-\u1cf1\u1cf5\u1cf6\u1d00-\u1dbf\u1e00-\u1f15\u1f18-\u1f1d\u1f20-\u1f45\u1f48-\u1f4d\u1f50-\u1f57\u1f59\u1f5b\u1f5d\u1f5f-\u1f7d\u1f80-\u1fb4\u1fb6-\u1fbc\u1fbe\u1fc2-\u1fc4\u1fc6-\u1fcc\u1fd0-\u1fd3\u1fd6-\u1fdb\u1fe0-\u1fec\u1ff2-\u1ff4\u1ff6-\u1ffc\u2071\u207f\u2090-\u209c\u2102\u2107\u210a-\u2113\u2115\u2118-\u211d\u2124\u2126\u2128\u212a-\u2139\u213c-\u213f\u2145-\u2149\u214e\u2160-\u2188\u2c00-\u2c2e\u2c30-\u2c5e\u2c60-\u2ce4\u2ceb-\u2cee\u2cf2\u2cf3\u2d00-\u2d25\u2d27\u2d2d\u2d30-\u2d67\u2d6f\u2d80-\u2d96\u2da0-\u2da6\u2da8-\u2dae\u2db0-\u2db6\u2db8-\u2dbe\u2dc0-\u2dc6\u2dc8-\u2dce\u2dd0-\u2dd6\u2dd8-\u2dde\u3005-\u3007\u3021-\u3029\u3031-\u3035\u3038-\u303c\u3041-\u3096\u309b-\u309f\u30a1-\u30fa\u30fc-\u30ff\u3105-\u312e\u3131-\u318e\u31a0-\u31ba\u31f0-\u31ff\u3400-\u4db5\u4e00-\u9fea\ua000-\ua48c\ua4d0-\ua4fd\ua500-\ua60c\ua610-\ua61f\ua62a\ua62b\ua640-\ua66e\ua67f-\ua69d\ua6a0-\ua6ef\ua717-\ua71f\ua722-\ua788\ua78b-\ua7ae\ua7b0-\ua7b7\ua7f7-\ua801\ua803-\ua805\ua8
07-\ua80a\ua80c-\ua822\ua840-\ua873\ua882-\ua8b3\ua8f2-\ua8f7\ua8fb\ua8fd\ua90a-\ua925\ua930-\ua946\ua960-\ua97c\ua984-\ua9b2\ua9cf\ua9e0-\ua9e4\ua9e6-\ua9ef\ua9fa-\ua9fe\uaa00-\uaa28\uaa40-\uaa42\uaa44-\uaa4b\uaa60-\uaa76\uaa7a\uaa7e-\uaaaf\uaab1\uaab5\uaab6\uaab9-\uaabd\uaac0\uaac2\uaadb-\uaadd\uaae0-\uaaea\uaaf2-\uaaf4\uab01-\uab06\uab09-\uab0e\uab11-\uab16\uab20-\uab26\uab28-\uab2e\uab30-\uab5a\uab5c-\uab65\uab70-\uabe2\uac00-\ud7a3\ud7b0-\ud7c6\ud7cb-\ud7fb\uf900-\ufa6d\ufa70-\ufad9\ufb00-\ufb06\ufb13-\ufb17\ufb1d\ufb1f-\ufb28\ufb2a-\ufb36\ufb38-\ufb3c\ufb3e\ufb40\ufb41\ufb43\ufb44\ufb46-\ufbb1\ufbd3-\ufd3d\ufd50-\ufd8f\ufd92-\ufdc7\ufdf0-\ufdfb\ufe70-\ufe74\ufe76-\ufefc\uff21-\uff3a\uff41-\uff5a\uff66-\uffbe\uffc2-\uffc7\uffca-\uffcf\uffd2-\uffd7\uffda-\uffdc";
/* prettier-ignore */
let nonASCIIidentifierChars = "\u200c\u200d\xb7\u0300-\u036f\u0387\u0483-\u0487\u0591-\u05bd\u05bf\u05c1\u05c2\u05c4\u05c5\u05c7\u0610-\u061a\u064b-\u0669\u0670\u06d6-\u06dc\u06df-\u06e4\u06e7\u06e8\u06ea-\u06ed\u06f0-\u06f9\u0711\u0730-\u074a\u07a6-\u07b0\u07c0-\u07c9\u07eb-\u07f3\u0816-\u0819\u081b-\u0823\u0825-\u0827\u0829-\u082d\u0859-\u085b\u08d4-\u08e1\u08e3-\u0903\u093a-\u093c\u093e-\u094f\u0951-\u0957\u0962\u0963\u0966-\u096f\u0981-\u0983\u09bc\u09be-\u09c4\u09c7\u09c8\u09cb-\u09cd\u09d7\u09e2\u09e3\u09e6-\u09ef\u0a01-\u0a03\u0a3c\u0a3e-\u0a42\u0a47\u0a48\u0a4b-\u0a4d\u0a51\u0a66-\u0a71\u0a75\u0a81-\u0a83\u0abc\u0abe-\u0ac5\u0ac7-\u0ac9\u0acb-\u0acd\u0ae2\u0ae3\u0ae6-\u0aef\u0afa-\u0aff\u0b01-\u0b03\u0b3c\u0b3e-\u0b44\u0b47\u0b48\u0b4b-\u0b4d\u0b56\u0b57\u0b62\u0b63\u0b66-\u0b6f\u0b82\u0bbe-\u0bc2\u0bc6-\u0bc8\u0bca-\u0bcd\u0bd7\u0be6-\u0bef\u0c00-\u0c03\u0c3e-\u0c44\u0c46-\u0c48\u0c4a-\u0c4d\u0c55\u0c56\u0c62\u0c63\u0c66-\u0c6f\u0c81-\u0c83\u0cbc\u0cbe-\u0cc4\u0cc6-\u0cc8\u0cca-\u0ccd\u0cd5\u0cd6\u0ce2\u0ce3\u0ce6-\u0cef\u0d00-\u0d03\u0d3b\u0d3c\u0d3e-\u0d44\u0d46-\u0d48\u0d4a-\u0d4d\u0d57\u0d62\u0d63\u0d66-\u0d6f\u0d82\u0d83\u0dca\u0dcf-\u0dd4\u0dd6\u0dd8-\u0ddf\u0de6-\u0def\u0df2\u0df3\u0e31\u0e34-\u0e3a\u0e47-\u0e4e\u0e50-\u0e59\u0eb1\u0eb4-\u0eb9\u0ebb\u0ebc\u0ec8-\u0ecd\u0ed0-\u0ed9\u0f18\u0f19\u0f20-\u0f29\u0f35\u0f37\u0f39\u0f3e\u0f3f\u0f71-\u0f84\u0f86\u0f87\u0f8d-\u0f97\u0f99-\u0fbc\u0fc6\u102b-\u103e\u1040-\u1049\u1056-\u1059\u105e-\u1060\u1062-\u1064\u1067-\u106d\u1071-\u1074\u1082-\u108d\u108f-\u109d\u135d-\u135f\u1369-\u1371\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17b4-\u17d3\u17dd\u17e0-\u17e9\u180b-\u180d\u1810-\u1819\u18a9\u1920-\u192b\u1930-\u193b\u1946-\u194f\u19d0-\u19da\u1a17-\u1a1b\u1a55-\u1a5e\u1a60-\u1a7c\u1a7f-\u1a89\u1a90-\u1a99\u1ab0-\u1abd\u1b00-\u1b04\u1b34-\u1b44\u1b50-\u1b59\u1b6b-\u1b73\u1b80-\u1b82\u1ba1-\u1bad\u1bb0-\u1bb9\u1be6-\u1bf3\u1c24-\u1c37\u1c40-\u1c49\u1c50-\u1c59\u1cd0-\u1cd2\u1cd4-\u1ce8\u1ced\u1cf2-\u1cf4\u1cf7-\u1cf9\u1dc0-\u1df9\u1dfb-\u1dff\u203f\u2040\u2054\u20d0-\u20dc\u20e1\u20e5-\u20f0\u2cef-\u2cf1\u2d7f\u2de0-\u2dff\u302a-\u302f\u3099\u309a\ua620-\ua629\ua66f\ua674-\ua67d\ua69e\ua69f\ua6f0\ua6f1\ua802\ua806\ua80b\ua823-\ua827\ua880\ua881\ua8b4-\ua8c5\ua8d0-\ua8d9\ua8e0-\ua8f1\ua900-\ua909\ua926-\ua92d\ua947-\ua953\ua980-\ua983\ua9b3-\ua9c0\ua9d0-\ua9d9\ua9e5\ua9f0-\ua9f9\uaa29-\uaa36\uaa43\uaa4c\uaa4d\uaa50-\uaa59\uaa7b-\uaa7d\uaab0\uaab2-\uaab4\uaab7\uaab8\uaabe\uaabf\uaac1\uaaeb-\uaaef\uaaf5\uaaf6\uabe3-\uabea\uabec\uabed\uabf0-\uabf9\ufb1e\ufe00-\ufe0f\ufe20-\ufe2f\ufe33\ufe34\ufe4d-\ufe4f\uff10-\uff19\uff3f";

const nonASCIIidentifierStart = new RegExp(
  "[" + nonASCIIidentifierStartChars + "]",
);
const nonASCIIidentifier = new RegExp(
  "[" + nonASCIIidentifierStartChars + nonASCIIidentifierChars + "]",
);

nonASCIIidentifierStartChars = nonASCIIidentifierChars = null;

// These are a run-length and offset encoded representation of the
// >0xffff code points that are a valid part of identifiers. The
// offset starts at 0x10000, and each pair of numbers represents an
// offset to the next range, and then a size of the range. They were
// generated by `bin/generate-identifier-regex.js`.
/* prettier-ignore */
const astralIdentifierStartCodes = [0,11,2,25,2,18,2,1,2,14,3,13,35,122,70,52,268,28,4,48,48,31,14,29,6,37,11,29,3,35,5,7,2,4,43,157,19,35,5,35,5,39,9,51,157,310,10,21,11,7,153,5,3,0,2,43,2,1,4,0,3,22,11,22,10,30,66,18,2,1,11,21,11,25,71,55,7,1,65,0,16,3,2,2,2,26,45,28,4,28,36,7,2,27,28,53,11,21,11,18,14,17,111,72,56,50,14,50,785,52,76,44,33,24,27,35,42,34,4,0,13,47,15,3,22,0,2,0,36,17,2,24,85,6,2,0,2,3,2,14,2,9,8,46,39,7,3,1,3,21,2,6,2,1,2,4,4,0,19,0,13,4,159,52,19,3,54,47,21,1,2,0,185,46,42,3,37,47,21,0,60,42,86,25,391,63,32,0,257,0,11,39,8,0,22,0,12,39,3,3,55,56,264,8,2,36,18,0,50,29,113,6,2,1,2,37,22,0,698,921,103,110,18,195,2749,1070,4050,582,8634,568,8,30,114,29,19,47,17,3,32,20,6,18,881,68,12,0,67,12,65,1,31,6124,20,754,9486,286,82,395,2309,106,6,12,4,8,8,9,5991,84,2,70,2,1,3,0,3,1,3,3,2,11,2,0,2,6,2,64,2,3,3,7,2,6,2,27,2,3,2,4,2,0,4,6,2,339,3,24,2,24,2,30,2,24,2,30,2,24,2,30,2,24,2,30,2,24,2,7,4149,196,60,67,1213,3,2,26,2,1,2,0,3,0,2,9,2,3,2,0,2,0,7,0,5,0,2,0,2,0,2,2,2,1,2,0,3,0,2,0,2,0,2,0,2,0,2,1,2,0,3,3,2,6,2,3,2,3,2,0,2,9,2,16,6,2,2,4,2,16,4421,42710,42,4148,12,221,3,5761,15,7472,3104,541];
/* prettier-ignore */
const astralIdentifierCodes = [509,0,227,0,150,4,294,9,1368,2,2,1,6,3,41,2,5,0,166,1,1306,2,54,14,32,9,16,3,46,10,54,9,7,2,37,13,2,9,52,0,13,2,49,13,10,2,4,9,83,11,7,0,161,11,6,9,7,3,57,0,2,6,3,1,3,2,10,0,11,1,3,6,4,4,193,17,10,9,87,19,13,9,214,6,3,8,28,1,83,16,16,9,82,12,9,9,84,14,5,9,423,9,280,9,41,6,2,3,9,0,10,10,47,15,406,7,2,7,17,9,57,21,2,13,123,5,4,0,2,1,2,6,2,0,9,9,19719,9,135,4,60,6,26,9,1016,45,17,3,19723,1,5319,4,4,5,9,7,3,6,31,3,149,2,1418,49,513,54,5,49,9,0,15,0,23,4,2,14,1361,6,2,16,3,6,2,1,2,4,2214,6,110,6,6,9,792487,239];

// This has a complexity linear to the value of the code. The
// assumption is that looking up astral identifier characters is
// rare.
function isInAstralSet(code: number, set: $ReadOnlyArray<number>): boolean {
  let pos = 0x10000;
  for (let i = 0; i < set.length; i += 2) {
    pos += set[i];
    if (pos > code) return false;

    pos += set[i + 1];
    if (pos >= code) return true;
  }
  return false;
}

// Test whether a given character code starts an identifier.

export function isIdentifierStart(code: number): boolean {
  if (code < 65) return code === 36;
  if (code < 91) return true;
  if (code < 97) return code === 95;
  if (code < 123) return true;
  if (code <= 0xffff) {
    return (
      code >= 0xaa && nonASCIIidentifierStart.test(String.fromCharCode(code))
    );
  }
  return isInAstralSet(code, astralIdentifierStartCodes);
}

// Test whether the current character code and the next character code are
// both "@", i.e. the start of an "@@" sequence.

export function isIteratorStart(current: number, next: number): boolean {
  return current === 64 && next === 64;
}

// Test whether a given character is part of an identifier.

export function isIdentifierChar(code: number): boolean {
  if (code < 48) return code === 36;
  if (code < 58) return true;
  if (code < 65) return false;
  if (code < 91) return true;
  if (code < 97) return code === 95;
  if (code < 123) return true;
  if (code <= 0xffff) {
    return code >= 0xaa && nonASCIIidentifier.test(String.fromCharCode(code));
  }
  return (
    isInAstralSet(code, astralIdentifierStartCodes) ||
    isInAstralSet(code, astralIdentifierCodes)
  );
}
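The helpers above are easiest to follow with a concrete trace. The sketch below is illustrative only: the import path is an assumption based on the package layout in this diff (the neighboring util files live under `src/util/`), and the astral example simply follows the arithmetic of `isInAstralSet` against the literal array above.

```js
// Illustrative sketch, not part of the diff. The import path is assumed.
import { isIdentifierStart, isIdentifierChar } from "./util/identifier";

// ASCII fast paths: "$" (36) and "_" (95) may start a name; digits may only continue one.
isIdentifierStart("$".charCodeAt(0)); // true
isIdentifierStart("1".charCodeAt(0)); // false
isIdentifierChar("1".charCodeAt(0));  // true

// Astral lookup: astralIdentifierStartCodes = [0, 11, 2, 25, ...] is decoded by
// isInAstralSet as alternating (offset, size) pairs starting from 0x10000:
//   pos = 0x10000; pos += 0  -> first range starts at 0x10000
//   pos += 11                -> ...and ends at 0x1000B (inclusive)
//   pos += 2                 -> next range starts at 0x1000D, and so on.
isIdentifierStart(0x10000); // true: falls inside the first decoded range
```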
54
packages/babel-parser/src/util/location.js
Normal file
@@ -0,0 +1,54 @@
// @flow

import { lineBreakG } from "./whitespace";

export type Pos = {
  start: number,
};

// These are used when `options.locations` is on, for the
// `startLoc` and `endLoc` properties.

export class Position {
  line: number;
  column: number;

  constructor(line: number, col: number) {
    this.line = line;
    this.column = col;
  }
}

export class SourceLocation {
  start: Position;
  end: Position;
  filename: string;
  identifierName: ?string;

  constructor(start: Position, end?: Position) {
    this.start = start;
    // $FlowIgnore (may start as null, but initialized later)
    this.end = end;
  }
}

// The `getLineInfo` function is mostly useful when the
// `locations` option is off (for performance reasons) and you
// want to find the line/column position for a given character
// offset. `input` should be the code string that the offset refers
// into.

export function getLineInfo(input: string, offset: number): Position {
  for (let line = 1, cur = 0; ; ) {
    lineBreakG.lastIndex = cur;
    const match = lineBreakG.exec(input);
    if (match && match.index < offset) {
      ++line;
      cur = match.index + match[0].length;
    } else {
      return new Position(line, offset - cur);
    }
  }
  // istanbul ignore next
  throw new Error("Unreachable");
}
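A quick sketch of how `getLineInfo` walks `lineBreakG` over the input; the sample string and import path are invented for illustration.

```js
// Illustrative sketch, not part of the diff.
import { getLineInfo } from "./util/location";

const code = "const a = 1;\nconst b = 2;";
const pos = getLineInfo(code, 19); // offset 19 points at the "b" on the second line
pos.line;   // 2  (lines are 1-based)
pos.column; // 6  (columns are 0-based offsets from the last line break)
```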
13
packages/babel-parser/src/util/whitespace.js
Normal file
@@ -0,0 +1,13 @@
// @flow

// Matches a whole line break (where CRLF is considered a single
// line break). Used to count lines.

export const lineBreak = /\r\n?|\n|\u2028|\u2029/;
export const lineBreakG = new RegExp(lineBreak.source, "g");

export function isNewLine(code: number): boolean {
  return code === 10 || code === 13 || code === 0x2028 || code === 0x2029;
}

export const nonASCIIwhitespace = /[\u1680\u180e\u2000-\u200a\u202f\u205f\u3000\ufeff]/;
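For orientation, a few hedged examples of what these exports match; the sample strings and import path are invented for illustration.

```js
// Illustrative sketch, not part of the diff.
import { lineBreak, lineBreakG, isNewLine, nonASCIIwhitespace } from "./util/whitespace";

isNewLine("\n".charCodeAt(0));     // true
isNewLine("\u2028".charCodeAt(0)); // true  (LINE SEPARATOR counts as a newline)
isNewLine(" ".charCodeAt(0));      // false

lineBreak.test("a\r\nb");              // true: CRLF matches as a single break
"a\r\nb\nc".split(lineBreakG).length;  // 3: the global copy is handy for counting lines
nonASCIIwhitespace.test("\u2003");     // true: EM SPACE is non-ASCII whitespace
```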
5
packages/babel-parser/test/estree-throws.js
Normal file
@@ -0,0 +1,5 @@
import path from "path";
import { runThrowTestsWithEstree } from "./helpers/runFixtureTests";
import { parse } from "../lib";

runThrowTestsWithEstree(path.join(__dirname, "fixtures"), parse);
5
packages/babel-parser/test/expressions.js
Normal file
@@ -0,0 +1,5 @@
import path from "path";
import { runFixtureTests } from "./helpers/runFixtureTests";
import { parseExpression } from "../lib";

runFixtureTests(path.join(__dirname, "expressions"), parseExpression);
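Each expression fixture that follows pairs an input file with the JSON AST it is expected to produce. Roughly, the runner exercises the equivalent of the call below for every fixture; this is a simplified sketch of the flow, not the actual helper in `./helpers/runFixtureTests`, and outside this repo the parser is consumed as `@babel/parser` rather than `../lib`.

```js
// Simplified sketch, not the real fixture runner.
const { parseExpression } = require("@babel/parser");

const ast = parseExpression("x + y"); // the first fixture below
ast.type;       // "BinaryExpression"
ast.operator;   // "+"
ast.left.name;  // "x"
ast.right.name; // "y"
// The stored JSON additionally records start/end offsets and loc positions.
```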
21
packages/babel-parser/test/expressions/esprima/LICENSE
Normal file
@@ -0,0 +1,21 @@
Copyright (c) jQuery Foundation, Inc. and Contributors, All Rights Reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

  * Redistributions of source code must retain the above copyright
    notice, this list of conditions and the following disclaimer.
  * Redistributions in binary form must reproduce the above copyright
    notice, this list of conditions and the following disclaimer in the
    documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL <COPYRIGHT HOLDER> BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1 @@
x + y
@@ -0,0 +1,50 @@
{
  "type": "BinaryExpression",
  "start": 0,
  "end": 5,
  "loc": {
    "start": {
      "line": 1,
      "column": 0
    },
    "end": {
      "line": 1,
      "column": 5
    }
  },
  "left": {
    "type": "Identifier",
    "start": 0,
    "end": 1,
    "loc": {
      "start": {
        "line": 1,
        "column": 0
      },
      "end": {
        "line": 1,
        "column": 1
      },
      "identifierName": "x"
    },
    "name": "x"
  },
  "operator": "+",
  "right": {
    "type": "Identifier",
    "start": 4,
    "end": 5,
    "loc": {
      "start": {
        "line": 1,
        "column": 4
      },
      "end": {
        "line": 1,
        "column": 5
      },
      "identifierName": "y"
    },
    "name": "y"
  }
}
@@ -0,0 +1 @@
|
||||
x - y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "-",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
"use strict" + 42
|
||||
@@ -0,0 +1,56 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 17,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 17
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "StringLiteral",
|
||||
"start": 0,
|
||||
"end": 12,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 12
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": "use strict",
|
||||
"raw": "\"use strict\""
|
||||
},
|
||||
"value": "use strict"
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 15,
|
||||
"end": 17,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 15
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 17
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x = 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"operator": "=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 4,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
eval = 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"operator": "=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 4,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"identifierName": "eval"
|
||||
},
|
||||
"name": "eval"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 7,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
arguments = 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 14,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 14
|
||||
}
|
||||
},
|
||||
"operator": "=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "arguments"
|
||||
},
|
||||
"name": "arguments"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 12,
|
||||
"end": 14,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 12
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 14
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x *= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "*=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x /= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "/=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x %= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "%=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x += 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "+=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x -= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "-=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x <<= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 8,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
}
|
||||
},
|
||||
"operator": "<<=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 6,
|
||||
"end": 8,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x >>= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 8,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
}
|
||||
},
|
||||
"operator": ">>=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 6,
|
||||
"end": 8,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x >>>= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"operator": ">>>=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 7,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x &= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "&=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x ^= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "^=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x |= 42
|
||||
@@ -0,0 +1,53 @@
|
||||
{
|
||||
"type": "AssignmentExpression",
|
||||
"start": 0,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"operator": "|=",
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"right": {
|
||||
"type": "NumericLiteral",
|
||||
"start": 5,
|
||||
"end": 7,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 7
|
||||
}
|
||||
},
|
||||
"extra": {
|
||||
"rawValue": 42,
|
||||
"raw": "42"
|
||||
},
|
||||
"value": 42
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x & y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "&",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x ^ y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "^",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x | y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "|",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x || y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "||",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x && y
|
||||
@@ -0,0 +1,50 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "&&",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x || y || z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "||",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "||",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 10,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x && y && z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "&&",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "&&",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 10,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x || y && z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "||",
|
||||
"right": {
|
||||
"type": "LogicalExpression",
|
||||
"start": 5,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
},
|
||||
"operator": "&&",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 10,
|
||||
"end": 11,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 11
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x || y ^ z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "LogicalExpression",
|
||||
"start": 0,
|
||||
"end": 10,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "||",
|
||||
"right": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 5,
|
||||
"end": 10,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 5,
|
||||
"end": 6,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 6
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
},
|
||||
"operator": "^",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 9,
|
||||
"end": 10,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 10
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x + y + z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 8,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x - y + z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "-",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 8,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x + y - z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "-",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 8,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x - y - z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "-",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
}
|
||||
},
|
||||
"operator": "-",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 8,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x + y * z
|
||||
@@ -0,0 +1,83 @@
|
||||
{
|
||||
"type": "BinaryExpression",
|
||||
"start": 0,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 0,
|
||||
"end": 1,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 0
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 1
|
||||
},
|
||||
"identifierName": "x"
|
||||
},
|
||||
"name": "x"
|
||||
},
|
||||
"operator": "+",
|
||||
"right": {
|
||||
"type": "BinaryExpression",
|
||||
"start": 4,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
}
|
||||
},
|
||||
"left": {
|
||||
"type": "Identifier",
|
||||
"start": 4,
|
||||
"end": 5,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 4
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 5
|
||||
},
|
||||
"identifierName": "y"
|
||||
},
|
||||
"name": "y"
|
||||
},
|
||||
"operator": "*",
|
||||
"right": {
|
||||
"type": "Identifier",
|
||||
"start": 8,
|
||||
"end": 9,
|
||||
"loc": {
|
||||
"start": {
|
||||
"line": 1,
|
||||
"column": 8
|
||||
},
|
||||
"end": {
|
||||
"line": 1,
|
||||
"column": 9
|
||||
},
|
||||
"identifierName": "z"
|
||||
},
|
||||
"name": "z"
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1 @@
|
||||
x + y / z
|
||||
Some files were not shown because too many files have changed in this diff.