Commit e1ff6c5

Concatenate character tokens

It looks like these few places were missed when the ParseError token type was removed. This PR fixes them to restore the behavior promised in the README:

> All adjacent character tokens are coalesced into a single ["Character", data] token.

1 parent c9816cf
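The invariant restored here can be sketched as a small helper (not part of this commit; the function name is illustrative) that merges runs of adjacent ["Character", data] tokens in an expected-output list:

```python
def coalesce_characters(tokens):
    """Merge adjacent ["Character", data] tokens into a single token,
    matching the coalescing rule the README promises for test output."""
    out = []
    for token in tokens:
        if token[0] == "Character" and out and out[-1][0] == "Character":
            # Extend the previous Character token instead of appending a new one.
            out[-1] = ["Character", out[-1][1] + token[1]]
        else:
            out.append(list(token))
    return out

# The fix in tokenizer/test2.test, for example:
coalesce_characters([["Character", "foo "], ["Character", "< bar"]])
# → [["Character", "foo < bar"]]
```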

3 files changed: +4 −4 lines changed

tokenizer/test2.test (+2 −2)

@@ -195,7 +195,7 @@

 {"description":"Unescaped <",
 "input":"foo < bar",
-"output":[["Character", "foo "], ["Character", "< bar"]],
+"output":[["Character", "foo < bar"]],
 "errors":[
 { "code": "invalid-first-character-of-tag-name", "line": 1, "col": 6 }
 ]},
@@ -242,7 +242,7 @@

 {"description":"Empty end tag with following characters",
 "input":"a</>bc",
-"output":[["Character", "a"], ["Character", "bc"]],
+"output":[["Character", "abc"]],
 "errors":[
 { "code": "missing-end-tag-name", "line": 1, "col": 4 }
 ]},

tokenizer/test3.test (+1 −1)

@@ -8415,7 +8415,7 @@

 {"description":"<<",
 "input":"<<",
-"output":[["Character", "<"], ["Character", "<"]],
+"output":[["Character", "<<"]],
 "errors":[
 { "code": "invalid-first-character-of-tag-name", "line": 1, "col": 2 },
 { "code": "eof-before-tag-name", "line": 1, "col": 3 }
tokenizer/unicodeCharsProblematic.test (+1 −1)

@@ -18,7 +18,7 @@
 {"description": "Invalid Unicode character U+DFFF with valid preceding character",
 "doubleEscaped":true,
 "input": "a\\uDFFF",
-"output":[["Character", "a"], ["Character", "\\uDFFF"]],
+"output":[["Character", "a\\uDFFF"]],
 "errors":[
 { "code": "surrogate-in-input-stream", "line": 1, "col": 2 }
 ]},
