In all our pseudocode examples assuming a Japanese calendar thus far, we've been passing in era names as lowercased ASCII string identifiers made from English transliterations of the names, e.g. { year: 2, era: 'reiwa' } for the current year. That requires that these identifiers be unique. However, that seems not to be the case: there are several eras written with different kanji that share the same romanized name, so the transliterated identifiers alone are not unique.
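To make the collision concrete, here is a minimal sketch, not any proposal's actual API; the `eraStart` table and its contents are illustrative assumptions, of what a lookup keyed by romanized era names runs into:

```ts
// Illustrative sketch only: a lookup table keyed by romanized era identifiers.
// Two historical eras, 康和 (began 1099) and 弘和 (began 1381), both romanize
// to "kowa", so a bare lowercased ASCII key cannot tell them apart.
const eraStart: Record<string, number> = {
  reiwa: 2019,
  heisei: 1989,
  kowa: 1099, // ...or should this be 1381? The string key alone can't say.
};

// { year: 2, era: 'kowa' } is therefore ambiguous between two different years.
```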
An ECMAScript string value is a finite ordered sequence of zero or more 16-bit unsigned integer values. ECMAScript does not place any restrictions or requirements on those values except that they must be 16-bit unsigned integers. In a well-formed string, each integer value in the sequence represents a single 16-bit code unit of UTF-16-encoded Unicode text. However, not all sequences of code units are well-formed UTF-16: a string may contain unpaired (lone) surrogates that do not encode any Unicode code point.
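As a quick illustration, the following sketch builds an ill-formed string from a lone surrogate; the last three lines assume an engine with ES2024's String.prototype.isWellFormed and toWellFormed:

```ts
// A lone high surrogate is a legal ECMAScript string value (one 16-bit code
// unit) but not well-formed UTF-16, because it has no trailing surrogate.
const lone = "\uD800";
console.log(lone.length);                   // 1
console.log(lone.isWellFormed());           // false (ES2024)
console.log("\uD83D\uDE00".isWellFormed()); // true: a valid surrogate pair (U+1F600)
console.log(lone.toWellFormed());           // "\uFFFD" – lone surrogate replaced
```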
Many ECMAScript hosts and libraries have various ways of distinguishing types or operations via some kind of discriminant:

ECMAScript:
- [Symbol.toStringTag]
- typeof

DOM:
- Node.prototype.nodeType (Node.ATTRIBUTE_NODE, Node.CDATA_SECTION_NODE, etc.)
- DOMException.prototype.code (DOMException.ABORT_ERR, DOMException.DATA_CLONE_ERR, etc.)
- XMLHttpRequest.prototype.readyState (XMLHttpRequest.DONE, XMLHttpRequest.LOADING, etc.)
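A short sketch of those discriminants in use (the DOM examples assume a browser-like environment):

```ts
// Language-level discriminants
typeof 42;                                   // "number"
Object.prototype.toString.call(new Map());   // "[object Map]" – driven by Symbol.toStringTag

// DOM numeric discriminants
document.body.nodeType === Node.ELEMENT_NODE;                        // true
new DOMException("", "AbortError").code === DOMException.ABORT_ERR;  // true (legacy code 20)
new XMLHttpRequest().readyState === XMLHttpRequest.UNSENT;           // true before open()
```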