diff --git a/.melos/base.yaml b/.melos/base.yaml index f5eedef..0c993bc 100644 --- a/.melos/base.yaml +++ b/.melos/base.yaml @@ -1,8 +1,11 @@ name: protevus_platform repository: https://github.com/protevus/platform packages: - - apps/** + - common/** + - drivers/** - packages/** + - incubation/** + - apps/** - helpers/tools/** - examples/** diff --git a/packages/route/AUTHORS.md b/common/body_parser/AUTHORS.md similarity index 100% rename from packages/route/AUTHORS.md rename to common/body_parser/AUTHORS.md diff --git a/common/body_parser/CHANGELOG.md b/common/body_parser/CHANGELOG.md new file mode 100644 index 0000000..7f1f98b --- /dev/null +++ b/common/body_parser/CHANGELOG.md @@ -0,0 +1,69 @@ +# Change Log + +## 5.3.0 + +* Require Dart >= 3.3 +* Updated `platform_http_server` to 4.4.0 +* Updated `lints` to 4.0.0 + +## 5.2.0 + +* Updated `lints` to 3.0.0 +* Updated `platform_http_server` to 4.2.0 + +## 5.1.0 + +* Updated `platform_http_server` to 4.1.1 +* Updated `http` to 1.0.0 + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 +* Updated `platform_http_server` to 4.0.0 + +## 4.0.1 + +* Updated `platform_http_server` to 3.0.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.0.1 + +* Fixed broken license link + +## 3.0.0 + +* Upgraded from `pedantic` to `lints` linter +* Published as `platform_body_parser` package +* Fixed linter warnings + +## 2.1.1 + +* Fixed calling deprecated methods in unit test + +## 2.1.0 + +* Replaced `http_server` with `platform_http_server` + +## 2.0.1 + +* Fixed source code formatting warning +* Updated README + +## 2.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 1.1.1 + +* Dart 2 updates; should fix Angel in Travis. + +## 1.1.0 + +* Add `parseBodyFromStream` diff --git a/common/body_parser/LICENSE b/common/body_parser/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/body_parser/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file diff --git a/common/body_parser/README.md b/common/body_parser/README.md new file mode 100644 index 0000000..f09cd26 --- /dev/null +++ b/common/body_parser/README.md @@ -0,0 +1,73 @@ +# Belatuk Body Parser + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_body_parser?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/body_parser/LICENSE) + +**Replacement of `package:body_parser` with breaking changes to support NNBD.** + +Parse request bodies and query strings in Dart, as well as multipart/form-data uploads. No external dependencies required. + +This is the request body parser powering the [Angel3 framework](https://pub.dev/packages/angel3_framework). If you are looking for a server-side solution with dependency injection, WebSockets, and more, then I highly recommend it as your first choice. Bam! + +## Contents + +- [Belatuk Body Parser](#belatuk-body-parser) + - [Contents](#contents) + - [About](#about) + - [Installation](#installation) + - [Usage](#usage) + - [Custom Body Parsing](#custom-body-parsing) + +### About + +This package is similar to Express.js's `body-parser` module. It fully supports JSON and x-www-form-urlencoded request bodies, as well as query strings. You can also include arrays in your query, in the same way you would for a PHP application. A benefit of this is that primitive types are automatically deserialized correctly. As in, if you have a `hello=1.5` request, then `body['hello']` will equal `1.5` and not `'1.5'`. + +### Installation + +To install Body Parser for your Dart project, simply add `platform_body_parser` to your pub dependencies. + + dependencies: + platform_body_parser: ^5.3.0 + +### Usage + +Body Parser exposes a simple class called `BodyParseResult`. You can easily parse the query string and request body for a request by calling `parseBody`, which returns a `Future`. + + ```dart + import 'dart:convert'; + import 'package:platform_body_parser/body_parser.dart'; + + main() async { + // ... + await for (HttpRequest request in server) { + request.response.write(json.encode((await parseBody(request)).body)); + await request.response.close(); + } + } + ``` + +You can also use `buildMapFromUri(Map, String)` to populate a map from a URL encoded string. + +This can easily be used with a library like [Angel3 JSON God](https://pub.dev/packages/angel3_json_god) to build structured JSON/REST APIs. Add validation and you've got an instant backend. + + ```dart + Future<MyClass> create(HttpRequest request) async { + return god.deserialize((await parseBody(request)).body, MyClass); + } + ``` + +### Custom Body Parsing + +In cases where you need to parse unrecognized content types, `body_parser` won't be of any help to you on its own. However, you can use the `originalBuffer` property of a `BodyParseResult` to see the original request buffer. To get this functionality, pass `storeOriginalBuffer` as `true` when calling `parseBody`. + +For example, if you wanted to [parse GraphQL queries within your server](https://github.com/dukefirehawk/graphql_dart)... + + ```dart + app.get('/graphql', (req, res) async { + if (req.headers.contentType.mimeType == 'application/graphql') { + var graphQlString = String.fromCharCodes(req.originalBuffer); + // ...
+ } + }); + ``` diff --git a/common/body_parser/analysis_options.yaml b/common/body_parser/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/body_parser/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/body_parser/example/main.dart b/common/body_parser/example/main.dart new file mode 100644 index 0000000..4b92f18 --- /dev/null +++ b/common/body_parser/example/main.dart @@ -0,0 +1,61 @@ +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; +import 'dart:isolate'; +import 'package:http_parser/http_parser.dart'; +import 'package:platform_body_parser/body_parser.dart'; + +void main() async { + var address = '127.0.0.1'; + var port = 3000; + var futures = <Future>[]; + + for (var i = 1; i < Platform.numberOfProcessors; i++) { + futures.add(Isolate.spawn(start, [address, port, i])); + } + + await Future.wait(futures).then((_) { + print('All instances started.'); + print( + 'Test with "wrk -t12 -c400 -d30s -s ./example/post.lua http://localhost:3000" or similar'); + start([address, port, 0]); + }); +} + +void start(List args) { + var address = InternetAddress(args[0] as String); + var port = 8080; + if (args[1] is int) { + port = args[1] as int; + } + + var id = 0; + if (args[2] is int) { + id = args[2] as int; + } + + HttpServer.bind(address, port, shared: true).then((server) { + server.listen((request) async { + // ignore: deprecated_member_use + var body = await defaultParseBody(request); + request.response + ..headers.contentType = ContentType('application', 'json') + ..write(json.encode(body.body)); + await request.response.close(); + }); + + print( + 'Server #$id listening at http://${server.address.address}:${server.port}'); + }); +} + +Future<BodyParseResult> defaultParseBody(HttpRequest request, + {bool storeOriginalBuffer = false}) { + return parseBodyFromStream( + request, + request.headers.contentType != null + ? MediaType.parse(request.headers.contentType.toString()) + : null, + request.uri, + storeOriginalBuffer: storeOriginalBuffer); +} diff --git a/common/body_parser/example/post.lua b/common/body_parser/example/post.lua new file mode 100644 index 0000000..524febc --- /dev/null +++ b/common/body_parser/example/post.lua @@ -0,0 +1,6 @@ +-- example HTTP POST script which demonstrates setting the +-- HTTP method, body, and adding a header + +wrk.method = "POST" +wrk.body = "foo=bar&baz=quux" +wrk.headers["Content-Type"] = "application/x-www-form-urlencoded" \ No newline at end of file diff --git a/common/body_parser/lib/body_parser.dart b/common/body_parser/lib/body_parser.dart new file mode 100644 index 0000000..b80e7dd --- /dev/null +++ b/common/body_parser/lib/body_parser.dart @@ -0,0 +1,6 @@ +/// A library for parsing HTTP request bodies and queries. +library platform_body_parser; + +export 'src/body_parse_result.dart'; +export 'src/file_upload_info.dart'; +export 'src/parse_body.dart'; diff --git a/common/body_parser/lib/src/body_parse_result.dart b/common/body_parser/lib/src/body_parse_result.dart new file mode 100644 index 0000000..e47fc89 --- /dev/null +++ b/common/body_parser/lib/src/body_parse_result.dart @@ -0,0 +1,28 @@ +import 'file_upload_info.dart'; + +/// A representation of data from an incoming request. +abstract class BodyParseResult { + /// The parsed body. + Map get body; + + /// The parsed query string. + Map get query; + + /// All files uploaded within this request. + List get files; + + /// The original body bytes sent with this request.
+ /// + /// You must set [storeOriginalBuffer] to `true` to see this. + List? get originalBuffer; + + /// If an error was encountered while parsing the body, it will appear here. + /// + /// Otherwise, this is `null`. + dynamic get error; + + /// If an error was encountered while parsing the body, the call stack will appear here. + /// + /// Otherwise, this is `null`. + StackTrace? get stack; +} diff --git a/common/body_parser/lib/src/chunk.dart b/common/body_parser/lib/src/chunk.dart new file mode 100644 index 0000000..90c078b --- /dev/null +++ b/common/body_parser/lib/src/chunk.dart @@ -0,0 +1,7 @@ +import 'file_upload_info.dart'; + +List getFileDataFromChunk( + String chunk, String boundary, String fileUploadName, Map body) { + var result = []; + return result; +} diff --git a/common/body_parser/lib/src/file_upload_info.dart b/common/body_parser/lib/src/file_upload_info.dart new file mode 100644 index 0000000..285d72d --- /dev/null +++ b/common/body_parser/lib/src/file_upload_info.dart @@ -0,0 +1,17 @@ +/// Represents a file uploaded to the server. +class FileUploadInfo { + /// The MIME type of the uploaded file. + String? mimeType; + + /// The name of the file field from the request. + String? name; + + /// The filename of the file. + String? filename; + + /// The bytes that make up this file. + List data; + + FileUploadInfo( + {this.mimeType, this.name, this.filename, this.data = const []}); +} diff --git a/common/body_parser/lib/src/get_value.dart b/common/body_parser/lib/src/get_value.dart new file mode 100644 index 0000000..8d2106e --- /dev/null +++ b/common/body_parser/lib/src/get_value.dart @@ -0,0 +1,22 @@ +import 'dart:convert'; + +dynamic getValue(String value) { + try { + var numValue = num.parse(value); + if (!numValue.isNaN) { + return numValue; + } else { + return value; + } + } on FormatException { + if (value.startsWith('[') && value.endsWith(']')) { + return json.decode(value); + } else if (value.startsWith('{') && value.endsWith('}')) { + return json.decode(value); + } else if (value.trim().toLowerCase() == 'null') { + return null; + } else { + return value; + } + } +} diff --git a/common/body_parser/lib/src/map_from_uri.dart b/common/body_parser/lib/src/map_from_uri.dart new file mode 100644 index 0000000..d9ed8e2 --- /dev/null +++ b/common/body_parser/lib/src/map_from_uri.dart @@ -0,0 +1,44 @@ +import 'get_value.dart'; + +/// Parses a URI-encoded string into real data! **Wow!** +/// +/// Whichever map you provide will be automatically populated from the urlencoded body string you provide. +void buildMapFromUri(Map map, String body) { + var parseArrayRgx = RegExp(r'^(.+)\[\]$'); + + for (var keyValuePair in body.split('&')) { + if (keyValuePair.contains('=')) { + var equals = keyValuePair.indexOf('='); + var key = Uri.decodeQueryComponent(keyValuePair.substring(0, equals)); + var value = Uri.decodeQueryComponent(keyValuePair.substring(equals + 1)); + + if (parseArrayRgx.hasMatch(key)) { + Match queryMatch = parseArrayRgx.firstMatch(key)!; + key = queryMatch.group(1)!; + if (map[key] is! List) { + map[key] = []; + } + + map[key].add(getValue(value)); + } else if (key.contains('.')) { + // i.e. map.foo.bar => [map, foo, bar] + var keys = key.split('.'); + + var targetMap = map[keys[0]] != null ? map[keys[0]] as Map? : {}; + map[keys[0]] = targetMap; + for (var i = 1; i < keys.length; i++) { + if (i < keys.length - 1) { + targetMap![keys[i]] = targetMap[keys[i]] ?? 
{}; + targetMap = targetMap[keys[i]] as Map?; + } else { + targetMap![keys[i]] = getValue(value); + } + } + } else { + map[key] = getValue(value); + } + } else { + map[Uri.decodeQueryComponent(keyValuePair)] = true; + } + } +} diff --git a/common/body_parser/lib/src/parse_body.dart b/common/body_parser/lib/src/parse_body.dart new file mode 100644 index 0000000..f213500 --- /dev/null +++ b/common/body_parser/lib/src/parse_body.dart @@ -0,0 +1,147 @@ +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; +import 'dart:typed_data'; + +import 'package:http_parser/http_parser.dart'; +import 'package:platform_http_server/http_server.dart'; +import 'package:mime/mime.dart'; + +import 'body_parse_result.dart'; +import 'file_upload_info.dart'; +import 'map_from_uri.dart'; + +/// Forwards to [parseBodyFromStream]. +@Deprecated("parseBodyFromStream") +Future parseBody(HttpRequest request, + {bool storeOriginalBuffer = false}) { + return parseBodyFromStream( + request, + request.headers.contentType != null + ? MediaType.parse(request.headers.contentType.toString()) + : null, + request.uri, + storeOriginalBuffer: storeOriginalBuffer); +} + +/// Grabs data from an incoming request. +/// +/// Supports URL-encoded and JSON, as well as multipart/* forms. +/// On a file upload request, only fields with the name **'file'** are processed +/// as files. Anything else is put in the body. You can change the upload file name +/// via the *fileUploadName* parameter. :) +/// +/// Use [storeOriginalBuffer] to add the original request bytes to the result. +Future parseBodyFromStream( + Stream data, MediaType? contentType, Uri requestUri, + {bool storeOriginalBuffer = false}) async { + var result = _BodyParseResultImpl(); + + Future getBytes() { + return data + .fold(BytesBuilder(copy: false), (a, b) => a..add(b)) + .then((b) => b.takeBytes()); + } + + Future getBody() { + if (storeOriginalBuffer) { + return getBytes().then((bytes) { + result.originalBuffer = bytes; + return utf8.decode(bytes); + }); + } else { + return utf8.decoder.bind(data).join(); + } + } + + try { + if (contentType != null) { + if (contentType.type == 'multipart' && + contentType.parameters.containsKey('boundary')) { + Stream stream; + + if (storeOriginalBuffer) { + var bytes = result.originalBuffer = await getBytes(); + var ctrl = StreamController()..add(bytes); + await ctrl.close(); + stream = ctrl.stream; + } else { + stream = data; + } + + var parts = + MimeMultipartTransformer(contentType.parameters['boundary']!) + .bind(stream) + .map((part) => + HttpMultipartFormData.parse(part, defaultEncoding: utf8)); + + await for (HttpMultipartFormData part in parts) { + if (part.isBinary || + part.contentDisposition.parameters.containsKey('filename')) { + var builder = await part.fold( + BytesBuilder(copy: false), + (BytesBuilder b, d) => + b..add(d is! String ? (d as List?)! : d.codeUnits)); + var upload = FileUploadInfo( + mimeType: part.contentType!.mimeType, + name: part.contentDisposition.parameters['name'], + filename: + part.contentDisposition.parameters['filename'] ?? 
'file', + data: builder.takeBytes()); + result.files.add(upload); + } else if (part.isText) { + var text = await part.join(); + buildMapFromUri(result.body, + '${part.contentDisposition.parameters["name"]}=$text'); + } + } + } else if (contentType.mimeType == 'application/json') { + result.body.addAll( + _foldToStringDynamic(json.decode(await getBody()) as Map?)!); + } else if (contentType.mimeType == 'application/x-www-form-urlencoded') { + var body = await getBody(); + buildMapFromUri(result.body, body); + } else if (storeOriginalBuffer == true) { + result.originalBuffer = await getBytes(); + } + } else { + if (requestUri.hasQuery) { + buildMapFromUri(result.query, requestUri.query); + } + + if (storeOriginalBuffer == true) { + result.originalBuffer = await getBytes(); + } + } + } catch (e, st) { + result.error = e; + result.stack = st; + } + + return result; +} + +class _BodyParseResultImpl implements BodyParseResult { + @override + Map body = {}; + + @override + List files = []; + + @override + List? originalBuffer; + + @override + Map query = {}; + + @override + dynamic error; + + @override + StackTrace? stack; +} + +Map? _foldToStringDynamic(Map? map) { + return map?.keys.fold>( + {}, (out, k) => out..[k.toString()] = map[k]); +} diff --git a/common/body_parser/pubspec.lock b/common/body_parser/pubspec.lock new file mode 100644 index 0000000..7762360 --- /dev/null +++ b/common/body_parser/pubspec.lock @@ -0,0 +1,417 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + 
name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http: + dependency: "direct dev" + description: + name: http + sha256: b9c29a161230ee03d3ccf545097fccd9b87a5264228c5d348202e0f0c28f9010 + url: "https://pub.dev" + source: hosted + version: "1.2.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: "direct main" + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: "direct main" + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + platform_http_server: + dependency: "direct main" + description: + path: "../http_server" 
+ relative: true + source: path + version: "4.5.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: 
f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/body_parser/pubspec.yaml b/common/body_parser/pubspec.yaml new file mode 100644 index 0000000..e6f7a34 --- /dev/null +++ b/common/body_parser/pubspec.yaml @@ -0,0 +1,14 @@ +name: platform_body_parser +version: 5.3.0 +description: Parse request bodies and query strings in Dart. Supports JSON, URL-encoded, and multi-part bodies. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/body_parser +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + http_parser: ^4.0.0 + platform_http_server: ^4.4.0 + mime: ^2.0.0 +dev_dependencies: + http: ^1.0.0 + test: ^1.24.0 + lints: ^4.0.0 \ No newline at end of file diff --git a/common/body_parser/test/form_data_test.dart b/common/body_parser/test/form_data_test.dart new file mode 100644 index 0000000..0f74878 --- /dev/null +++ b/common/body_parser/test/form_data_test.dart @@ -0,0 +1,154 @@ +import 'dart:io'; +import 'dart:convert'; +import 'package:platform_body_parser/body_parser.dart'; +import 'package:http/http.dart' as http; +import 'package:http_parser/http_parser.dart'; +import 'package:test/test.dart'; +import 'server_test.dart'; + +Future _parseBody(HttpRequest request) { + return parseBodyFromStream( + request, + request.headers.contentType != null + ? MediaType.parse(request.headers.contentType.toString()) + : null, + request.uri, + storeOriginalBuffer: false); +} + +void main() { + HttpServer? server; + String? url; + http.Client? 
client; + + setUp(() async { + server = await HttpServer.bind('127.0.0.1', 0); + server!.listen((HttpRequest request) async { + //Server will simply return a JSON representation of the parsed body + // ignore: deprecated_member_use + request.response.write(jsonEncodeBody(await _parseBody(request))); + await request.response.close(); + }); + url = 'http://localhost:${server!.port}'; + print('Test server listening on $url'); + client = http.Client(); + }); + + tearDown(() async { + await server!.close(force: true); + client!.close(); + server = null; + url = null; + client = null; + }); + + test('No upload', () async { + var boundary = 'myBoundary'; + var headers = { + 'content-type': 'multipart/form-data; boundary=$boundary' + }; + var postData = ''' +--$boundary +Content-Disposition: form-data; name="hello" + +world +--$boundary-- +''' + .replaceAll('\n', '\r\n'); + + print( + 'Form Data: \n${postData.replaceAll("\r", "\\r").replaceAll("\n", "\\n")}'); + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var jsons = json.decode(response.body); + var files = jsons['files'].map((map) { + return map.keys.fold<Map<String, dynamic>>( + {}, (out, k) => out..[k.toString()] = map[k]); + }); + expect(files.length, equals(0)); + expect(jsons['body']['hello'], equals('world')); + }); + + test('Single upload', () async { + var boundary = 'myBoundary'; + var headers = { + 'content-type': ContentType('multipart', 'form-data', + parameters: {'boundary': boundary}).toString() + }; + var postData = ''' +--$boundary +Content-Disposition: form-data; name="hello" + +world +--$boundary +Content-Disposition: form-data; name="file"; filename="app.dart" +Content-Type: application/dart + +Hello world +--$boundary-- +''' + .replaceAll('\n', '\r\n'); + + print( + 'Form Data: \n${postData.replaceAll("\r", "\\r").replaceAll("\n", "\\n")}'); + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var jsons = json.decode(response.body); + var files = jsons['files']; + expect(files.length, equals(1)); + expect(files[0]['name'], equals('file')); + expect(files[0]['mimeType'], equals('application/dart')); + expect(files[0]['data'].length, equals(11)); + expect(files[0]['filename'], equals('app.dart')); + expect(jsons['body']['hello'], equals('world')); + }); + + test('Multiple upload', () async { + var boundary = 'myBoundary'; + var headers = { + 'content-type': 'multipart/form-data; boundary=$boundary' + }; + var postData = ''' +--$boundary +Content-Disposition: form-data; name="json" + +god +--$boundary +Content-Disposition: form-data; name="num" + +14.50000 +--$boundary +Content-Disposition: form-data; name="file"; filename="app.dart" +Content-Type: text/plain + +Hello world +--$boundary +Content-Disposition: form-data; name="entry-point"; filename="main.js" +Content-Type: text/javascript + +function main() { + console.log("Hello, world!"); +} +--$boundary-- +''' + .replaceAll('\n', '\r\n'); + + print( + 'Form Data: \n${postData.replaceAll("\r", "\\r").replaceAll("\n", "\\n")}'); + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var jsons = json.decode(response.body); + var files = jsons['files']; + expect(files.length, equals(2)); + expect(files[0]['name'], equals('file')); + expect(files[0]['mimeType'], equals('text/plain')); + expect(files[0]['data'].length, equals(11)); + expect(files[1]['name'],
equals('entry-point')); + expect(files[1]['mimeType'], equals('text/javascript')); + expect(jsons['body']['json'], equals('god')); + expect(jsons['body']['num'], equals(14.5)); + }); +} diff --git a/common/body_parser/test/server_test.dart b/common/body_parser/test/server_test.dart new file mode 100644 index 0000000..df12a80 --- /dev/null +++ b/common/body_parser/test/server_test.dart @@ -0,0 +1,174 @@ +import 'dart:convert'; +import 'dart:io' show HttpRequest, HttpServer; + +import 'package:platform_body_parser/body_parser.dart'; +import 'package:http/http.dart' as http; +import 'package:http_parser/http_parser.dart'; +import 'package:test/test.dart'; + +const token = + 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhdWQiOiIxMjcuMC4wLjEiLCJleHAiOi0xLCJpYXQiOiIyMDE2LTEyLTIyVDEyOjQ5OjUwLjM2MTQ0NiIsImlzcyI6ImFuZ2VsX2F1dGgiLCJzdWIiOiIxMDY2OTQ4Mzk2MDIwMjg5ODM2NTYifQ==.PYw7yUb-cFWD7N0sSLztP7eeRvO44nu1J2OgDNyT060='; + +String jsonEncodeBody(BodyParseResult result) { + return json.encode({ + 'query': result.query, + 'body': result.body, + 'error': result.error?.toString(), + 'files': result.files.map((f) { + return { + 'name': f.name, + 'mimeType': f.mimeType, + 'filename': f.filename, + 'data': f.data, + }; + }).toList(), + 'originalBuffer': result.originalBuffer, + 'stack': null, //result.stack.toString(), + }); +} + +Future _parseBody(HttpRequest request) { + return parseBodyFromStream( + request, + request.headers.contentType != null + ? MediaType.parse(request.headers.contentType.toString()) + : null, + request.uri, + storeOriginalBuffer: true); +} + +void main() { + HttpServer? server; + String? url; + http.Client? client; + + setUp(() async { + server = await HttpServer.bind('127.0.0.1', 0); + server!.listen((HttpRequest request) async { + //Server will simply return a JSON representation of the parsed body + request.response.write( + // ignore: deprecated_member_use + jsonEncodeBody(await _parseBody(request))); + await request.response.close(); + }); + url = 'http://localhost:${server!.port}'; + print('Test server listening on $url'); + client = http.Client(); + }); + tearDown(() async { + await server!.close(force: true); + client!.close(); + server = null; + url = null; + client = null; + }); + + group('query string', () { + test('GET Simple', () async { + print('GET $url/?hello=world'); + var response = await client!.get(Uri.parse('$url/?hello=world')); + print('Response: ${response.body}'); + var result = json.decode(response.body); + expect(result['body'], equals({})); + expect(result['query'], equals({'hello': 'world'})); + expect(result['files'], equals([])); + //expect(result['originalBuffer'], isNull); + }); + + test('GET Complex', () async { + var postData = + 'hello=world&nums%5B%5D=1&nums%5B%5D=2.0&nums%5B%5D=${3 - 1}&map.foo.bar=baz'; + print('Body: $postData'); + var response = await client!.get(Uri.parse('$url/?$postData')); + print('Response: ${response.body}'); + var query = json.decode(response.body)['query']; + expect(query['hello'], equals('world')); + expect(query['nums'][2], equals(2)); + expect(query['map'] is Map, equals(true)); + expect(query['map']['foo'], equals({'bar': 'baz'})); + }); + + test('JWT', () async { + var postData = 'token=$token'; + print('Body: $postData'); + var response = await client!.get(Uri.parse('$url/?$postData')); + print('Response: ${response.body}'); + var query = json.decode(response.body)['query']; + expect(query['token'], equals(token)); + }); + }); + + group('urlencoded', () { + var headers = { + 'content-type': 
'application/x-www-form-urlencoded' + }; + test('POST Simple', () async { + print('Body: hello=world'); + var response = await client! + .post(Uri.parse(url!), headers: headers, body: 'hello=world'); + print('Response: ${response.body}'); + var result = json.decode(response.body); + expect(result['query'], equals({})); + expect(result['body'], equals({'hello': 'world'})); + expect(result['files'], equals([])); + expect(result['originalBuffer'], isList); + expect(result['originalBuffer'], isNotEmpty); + }); + + test('Post Complex', () async { + var postData = + 'hello=world&nums%5B%5D=1&nums%5B%5D=2.0&nums%5B%5D=${3 - 1}&map.foo.bar=baz'; + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var body = json.decode(response.body)['body']; + expect(body['hello'], equals('world')); + expect(body['nums'][2], equals(2)); + expect(body['map'] is Map, equals(true)); + expect(body['map']['foo'], equals({'bar': 'baz'})); + }); + + test('JWT', () async { + var postData = 'token=$token'; + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + var body = json.decode(response.body)['body']; + expect(body['token'], equals(token)); + }); + }); + + group('json', () { + var headers = {'content-type': 'application/json'}; + test('Post Simple', () async { + var postData = json.encode({'hello': 'world'}); + print('Body: $postData'); + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var result = json.decode(response.body); + expect(result['body'], equals({'hello': 'world'})); + expect(result['query'], equals({})); + expect(result['files'], equals([])); + expect(result['originalBuffer'], allOf(isList, isNotEmpty)); + }); + + test('Post Complex', () async { + var postData = json.encode({ + 'hello': 'world', + 'nums': [1, 2.0, 3 - 1], + 'map': { + 'foo': {'bar': 'baz'} + } + }); + print('Body: $postData'); + var response = + await client!.post(Uri.parse(url!), headers: headers, body: postData); + print('Response: ${response.body}'); + var body = json.decode(response.body)['body']; + expect(body['hello'], equals('world')); + expect(body['nums'][2], equals(2)); + expect(body['map'] is Map, equals(true)); + expect(body['map']['foo'], equals({'bar': 'baz'})); + }); + }); +} diff --git a/common/code_buffer/AUTHORS.md b/common/code_buffer/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/code_buffer/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. 
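The test suites above drive `parseBodyFromStream` through a live `HttpServer`. As a rough standalone sketch of the same entry point (the payload, URI, and printed output here are illustrative assumptions, not part of the patch), any byte stream can be fed to it directly:

```dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:http_parser/http_parser.dart';
import 'package:platform_body_parser/body_parser.dart';

Future<void> main() async {
  // A urlencoded payload, delivered as a byte stream the way an HttpRequest would deliver it.
  var payload =
      Uint8List.fromList(utf8.encode('hello=world&nums%5B%5D=1&nums%5B%5D=2.5'));

  var result = await parseBodyFromStream(
    Stream.value(payload),
    MediaType('application', 'x-www-form-urlencoded'),
    Uri.parse('http://localhost/'),
    storeOriginalBuffer: true,
  );

  print(result.body); // expected: {hello: world, nums: [1, 2.5]}
  print(result.originalBuffer?.length); // raw byte count, kept because storeOriginalBuffer is true
}
```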
diff --git a/common/code_buffer/CHANGELOG.md b/common/code_buffer/CHANGELOG.md new file mode 100644 index 0000000..1e043c3 --- /dev/null +++ b/common/code_buffer/CHANGELOG.md @@ -0,0 +1,55 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.0.2 + +* Fixed license link + +## 3.0.1 + +* Updated README + +## 3.0.0 + +* Upgraded from `pedantic` to `lints` linter +* Published as `platform_code_buffer` package + +## 2.0.3 + +* Resolved static analysis warnings + +## 2.0.2 + +* Updated README + +## 2.0.1 + +* Fixed invalid homepage url in pubspec.yaml + +## 2.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 1.0.1 + +* Added `CodeBuffer.noWhitespace()`. diff --git a/common/code_buffer/LICENSE b/common/code_buffer/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/code_buffer/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file diff --git a/common/code_buffer/README.md b/common/code_buffer/README.md new file mode 100644 index 0000000..e9f1560 --- /dev/null +++ b/common/code_buffer/README.md @@ -0,0 +1,69 @@ +# Belatuk Code Buffer + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_code_buffer?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/code_buffer/LICENSE) + +**Replacement of `package:code_buffer` with breaking changes to support NNBD.** + +An advanced StringBuffer geared toward generating code and source maps.
+ +## Installation + +In your `pubspec.yaml`: + +```yaml +dependencies: + platform_code_buffer: ^5.2.0 +``` + +## Usage + +Use a `CodeBuffer` just like any regular `StringBuffer`: + +```dart +String someFunc() { + var buf = CodeBuffer(); + buf + ..write('hello ') + ..writeln('world!'); + return buf.toString(); +} +``` + +However, a `CodeBuffer` supports indentation. + +```dart +void someOtherFunc() { + var buf = CodeBuffer(); + // Custom options... + var customBuf = CodeBuffer(newline: '\r\n', space: '\t', trailingNewline: true); + + // Any following lines will have an incremented indentation level... + buf.indent(); + + // And vice-versa: + buf.outdent(); +} +``` + +`CodeBuffer` instances keep track of every `SourceSpan` they create. +This makes them useful for codegen tools or to-JS compilers. + +```dart +void someFunc(CodeBuffer buf) { + buf.write('hello'); + expect(buf.lastLine!.text, 'hello'); + + buf.writeln('world'); + expect(buf.lastLine!.lastSpan!.start.column, 5); +} +``` + +You can copy a `CodeBuffer` into another, heeding indentation rules: + +```dart +void yetAnotherFunc(CodeBuffer a, CodeBuffer b) { + b.copyInto(a); +} +``` diff --git a/common/code_buffer/analysis_options.yaml b/common/code_buffer/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/code_buffer/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/code_buffer/example/main.dart b/common/code_buffer/example/main.dart new file mode 100644 index 0000000..7d4b5bb --- /dev/null +++ b/common/code_buffer/example/main.dart @@ -0,0 +1,46 @@ +import 'package:platform_code_buffer/code_buffer.dart'; +import 'package:test/test.dart'; + +/// Use a `CodeBuffer` just like any regular `StringBuffer`: +String someFunc() { + var buf = CodeBuffer(); + buf + ..write('hello ') + ..writeln('world!'); + return buf.toString(); +} + +/// However, a `CodeBuffer` supports indentation. +void someOtherFunc() { + var buf = CodeBuffer(); + + // Custom options... + // ignore: unused_local_variable + var customBuf = + CodeBuffer(newline: '\r\n', space: '\t', trailingNewline: true); + + // Without whitespace... + // ignore: unused_local_variable + var minifyingBuf = CodeBuffer.noWhitespace(); + + // Any following lines will have an incremented indentation level... + buf.indent(); + + // And vice-versa: + buf.outdent(); +} + +/// `CodeBuffer` instances keep track of every `SourceSpan` they create. +//This makes them useful for codegen tools or to-JS compilers. +void yetAnotherOtherFunc(CodeBuffer buf) { + buf.write('hello'); + expect(buf.lastLine!.text, 'hello'); + + buf.writeln('world'); + expect(buf.lastLine!.lastSpan!.start.column, 5); +} + +/// You can copy a `CodeBuffer` into another, heeding indentation rules: +void yetEvenAnotherFunc(CodeBuffer a, CodeBuffer b) { + b.copyInto(a); +} diff --git a/common/code_buffer/lib/code_buffer.dart b/common/code_buffer/lib/code_buffer.dart new file mode 100644 index 0000000..2e377c5 --- /dev/null +++ b/common/code_buffer/lib/code_buffer.dart @@ -0,0 +1,231 @@ +import 'package:source_span/source_span.dart'; + +/// An advanced StringBuffer geared toward generating code and source maps. +class CodeBuffer implements StringBuffer { + /// The character sequence used to represent a line break. + final String newline; + + /// The character sequence used to represent a space/tab. + final String space; + + /// The source URL to be applied to all generated [SourceSpan] instances.
+ final dynamic sourceUrl; + + /// If `true` (default: `false`), then an additional [newline] will be inserted at the end of the generated string. + final bool trailingNewline; + + final List _lines = []; + CodeBufferLine? _currentLine, _lastLine; + int _indentationLevel = 0; + int _length = 0; + + CodeBuffer( + {this.space = ' ', + this.newline = '\n', + this.trailingNewline = false, + this.sourceUrl}); + + /// Creates a [CodeBuffer] that does not emit additional whitespace. + factory CodeBuffer.noWhitespace({sourceUrl}) => CodeBuffer( + space: '', newline: '', trailingNewline: false, sourceUrl: sourceUrl); + + /// The last line created within this buffer. + CodeBufferLine? get lastLine => _lastLine; + + /// Returns an immutable collection of the [CodeBufferLine]s within this instance. + List get lines => List.unmodifiable(_lines); + + @override + bool get isEmpty => _lines.isEmpty; + + @override + bool get isNotEmpty => _lines.isNotEmpty; + + @override + int get length => _length; + + CodeBufferLine _createLine() { + var start = SourceLocation( + _length, + sourceUrl: sourceUrl, + line: _lines.length, + column: _indentationLevel * space.length, + ); + var line = CodeBufferLine._(_indentationLevel, start).._end = start; + _lines.add(_lastLine = line); + return line; + } + + /// Increments the indentation level. + void indent() { + _indentationLevel++; + } + + /// Decrements the indentation level, if it is greater than `0`. + void outdent() { + if (_indentationLevel > 0) _indentationLevel--; + } + + /// Copies the contents of this [CodeBuffer] into another, preserving indentation and source mapping information. + void copyInto(CodeBuffer other) { + if (_lines.isEmpty) return; + var i = 0; + + for (var line in _lines) { + // To compute offset: + // 1. Find current length of other + // 2. Add length of its newline + // 3. Add indentation + var column = (other._indentationLevel + line.indentationLevel) * + other.space.length; + var offset = other._length + other.newline.length + column; + + // Re-compute start + end + var start = SourceLocation( + offset, + sourceUrl: other.sourceUrl, + line: other._lines.length + i, + column: column, + ); + + var end = SourceLocation( + offset + line.span.length, + sourceUrl: other.sourceUrl, + line: start.line, + column: column + line._buf.length, + ); + + var clone = CodeBufferLine._( + line.indentationLevel + other._indentationLevel, start) + .._end = end + .._buf.write(line._buf.toString()); + + // Adjust lastSpan + if (line._lastSpan != null) { + var s = line._lastSpan!.start; + var lastSpanColumn = + ((line.indentationLevel + other._indentationLevel) * + other.space.length) + + line.text.indexOf(line._lastSpan!.text); + clone._lastSpan = SourceSpan( + SourceLocation( + offset + s.offset, + sourceUrl: other.sourceUrl, + line: clone.span.start.line, + column: lastSpanColumn, + ), + SourceLocation( + offset + s.offset + line._lastSpan!.length, + sourceUrl: other.sourceUrl, + line: clone.span.end.line, + column: lastSpanColumn + line._lastSpan!.length, + ), + line._lastSpan!.text, + ); + } + + other._lines.add(other._currentLine = other._lastLine = clone); + + // Adjust length accordingly... 
+ other._length = offset + clone.span.length; + i++; + } + + other.writeln(); + } + + @override + void clear() { + _lines.clear(); + _length = _indentationLevel = 0; + _currentLine = null; + } + + @override + void writeCharCode(int charCode) { + _currentLine ??= _createLine(); + + _currentLine!._buf.writeCharCode(charCode); + var end = _currentLine!._end; + _currentLine!._end = SourceLocation( + end.offset + 1, + sourceUrl: end.sourceUrl, + line: end.line, + column: end.column + 1, + ); + _length++; + _currentLine!._lastSpan = + SourceSpan(end, _currentLine!._end, String.fromCharCode(charCode)); + } + + @override + void write(Object? obj) { + var msg = obj.toString(); + _currentLine ??= _createLine(); + _currentLine!._buf.write(msg); + var end = _currentLine!._end; + _currentLine!._end = SourceLocation( + end.offset + msg.length, + sourceUrl: end.sourceUrl, + line: end.line, + column: end.column + msg.length, + ); + _length += msg.length; + _currentLine!._lastSpan = SourceSpan(end, _currentLine!._end, msg); + } + + @override + void writeln([Object? obj = '']) { + if (obj != null && obj != '') write(obj); + _currentLine = null; + _length++; + } + + @override + void writeAll(Iterable objects, [String separator = '']) { + write(objects.join(separator)); + } + + @override + String toString() { + var buf = StringBuffer(); + var i = 0; + + for (var line in lines) { + if (i++ > 0) buf.write(newline); + for (var j = 0; j < line.indentationLevel; j++) { + buf.write(space); + } + buf.write(line._buf.toString()); + } + + if (trailingNewline == true) buf.write(newline); + + return buf.toString(); + } +} + +/// Represents a line of text within a [CodeBuffer]. +class CodeBufferLine { + /// Mappings from one [SourceSpan] to another, to aid with generating dynamic source maps. + final Map sourceMappings = {}; + + /// The level of indentation preceding this line. + final int indentationLevel; + + final SourceLocation _start; + final StringBuffer _buf = StringBuffer(); + late SourceLocation _end; + SourceSpan? _lastSpan; + + CodeBufferLine._(this.indentationLevel, this._start); + + /// The [SourceSpan] corresponding to the last text written to this line. + SourceSpan? get lastSpan => _lastSpan; + + /// The [SourceSpan] corresponding to this entire line. + SourceSpan get span => SourceSpan(_start, _end, _buf.toString()); + + /// The text within this line. 
+ String get text => _buf.toString(); +} diff --git a/common/code_buffer/pubspec.lock b/common/code_buffer/pubspec.lock new file mode 100644 index 0000000..e523359 --- /dev/null +++ b/common/code_buffer/pubspec.lock @@ -0,0 +1,410 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + charcode: + dependency: "direct main" + description: + name: charcode + sha256: fb0f1107cac15a5ea6ef0a6ef71a807b9e4267c713bb93e00e92d737cc8dbd8a + url: "https://pub.dev" + source: hosted + version: "1.4.0" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: 
"4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + 
dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: "direct main" + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + 
url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/code_buffer/pubspec.yaml b/common/code_buffer/pubspec.yaml new file mode 100644 index 0000000..71757c4 --- /dev/null +++ b/common/code_buffer/pubspec.yaml @@ -0,0 +1,12 @@ +name: platform_code_buffer +version: 5.2.0 +description: An advanced StringBuffer geared toward generating code, and source maps. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/code_buffer +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + charcode: ^1.2.0 + source_span: ^1.8.1 +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/code_buffer/test/copy_test.dart b/common/code_buffer/test/copy_test.dart new file mode 100644 index 0000000..8085d79 --- /dev/null +++ b/common/code_buffer/test/copy_test.dart @@ -0,0 +1,47 @@ +import 'package:platform_code_buffer/code_buffer.dart'; +import 'package:test/test.dart'; + +void main() { + var a = CodeBuffer(), b = CodeBuffer(); + + setUp(() { + a.writeln('outer block 1'); + b + ..writeln('inner block 1') + ..writeln('inner block 2'); + b.copyInto(a..indent()); + a + ..outdent() + ..writeln('outer block 2'); + }); + + tearDown(() { + a.clear(); + b.clear(); + }); + + test('sets correct text', () { + expect( + a.toString(), + [ + 'outer block 1', + ' inner block 1', + ' inner block 2', + 'outer block 2', + ].join('\n')); + }); + + test('sets lastLine+lastSpan', () { + var c = CodeBuffer() + ..indent() + ..write('>') + ..writeln('innermost'); + c.copyInto(a); + expect(a.lastLine!.text, '>innermost'); + expect(a.lastLine!.span.start.column, 2); + expect(a.lastLine!.lastSpan!.start.line, 4); + expect(a.lastLine!.lastSpan!.start.column, 3); + expect(a.lastLine!.lastSpan!.end.line, 4); + expect(a.lastLine!.lastSpan!.end.column, 12); + }); +} diff --git a/common/code_buffer/test/span_test.dart b/common/code_buffer/test/span_test.dart new file mode 100644 index 0000000..f812fa2 --- /dev/null +++ b/common/code_buffer/test/span_test.dart @@ -0,0 +1,46 @@ +import 'package:charcode/charcode.dart'; +import 'package:platform_code_buffer/code_buffer.dart'; +import 'package:test/test.dart'; + +void main() { + var buf = CodeBuffer(); + tearDown(buf.clear); + + test('writeCharCode', () { + buf.writeCharCode($x); + expect(buf.lastLine!.lastSpan!.start.column, 0); + expect(buf.lastLine!.lastSpan!.start.line, 0); + expect(buf.lastLine!.lastSpan!.end.column, 1); + expect(buf.lastLine!.lastSpan!.end.line, 0); + }); + + test('write', () { + buf.write('foo'); + expect(buf.lastLine!.lastSpan!.start.column, 0); + expect(buf.lastLine!.lastSpan!.start.line, 0); + expect(buf.lastLine!.lastSpan!.end.column, 3); + expect(buf.lastLine!.lastSpan!.end.line, 0); + }); + + test('multiple writes in one line', () { + buf + ..write('foo') + ..write('baz'); + expect(buf.lastLine!.lastSpan!.start.column, 3); + expect(buf.lastLine!.lastSpan!.start.line, 0); + expect(buf.lastLine!.lastSpan!.end.column, 6); + expect(buf.lastLine!.lastSpan!.end.line, 0); + }); + + test('multiple lines', () { + buf + ..writeln('foo') + ..write('bar') + ..write('+') + ..writeln('baz'); + expect(buf.lastLine!.lastSpan!.start.column, 4); + expect(buf.lastLine!.lastSpan!.start.line, 1); + expect(buf.lastLine!.lastSpan!.end.column, 7); + expect(buf.lastLine!.lastSpan!.end.line, 1); + 
}); +} diff --git a/common/code_buffer/test/write_test.dart b/common/code_buffer/test/write_test.dart new file mode 100644 index 0000000..b5e2f86 --- /dev/null +++ b/common/code_buffer/test/write_test.dart @@ -0,0 +1,90 @@ +import 'package:charcode/charcode.dart'; +import 'package:test/test.dart'; +import 'package:platform_code_buffer/code_buffer.dart'; + +void main() { + var buf = CodeBuffer(); + tearDown(buf.clear); + + test('writeCharCode', () { + buf.writeCharCode($x); + expect(buf.toString(), 'x'); + }); + + test('write', () { + buf.write('hello world'); + expect(buf.toString(), 'hello world'); + }); + + test('custom space', () { + var b = CodeBuffer(space: '+') + ..writeln('foo') + ..indent() + ..writeln('baz'); + expect(b.toString(), 'foo\n+baz'); + }); + + test('custom newline', () { + var b = CodeBuffer(newline: 'N') + ..writeln('foo') + ..indent() + ..writeln('baz'); + expect(b.toString(), 'fooN baz'); + }); + + test('trailing newline', () { + var b = CodeBuffer(trailingNewline: true)..writeln('foo'); + expect(b.toString(), 'foo\n'); + }); + + group('multiple lines', () { + setUp(() { + buf + ..writeln('foo') + ..writeln('bar') + ..writeln('baz'); + expect(buf.lines, hasLength(3)); + expect(buf.lines[0].text, 'foo'); + expect(buf.lines[1].text, 'bar'); + expect(buf.lines[2].text, 'baz'); + }); + }); + + test('indent', () { + buf + ..writeln('foo') + ..indent() + ..writeln('bar') + ..indent() + ..writeln('baz') + ..outdent() + ..writeln('quux') + ..outdent() + ..writeln('end'); + expect(buf.toString(), 'foo\n bar\n baz\n quux\nend'); + }); + + group('sets lastLine text', () { + test('writeCharCode', () { + buf.writeCharCode($x); + expect(buf.lastLine!.text, 'x'); + }); + + test('write', () { + buf.write('hello world'); + expect(buf.lastLine!.text, 'hello world'); + }); + }); + + group('sets lastLine lastSpan', () { + test('writeCharCode', () { + buf.writeCharCode($x); + expect(buf.lastLine!.lastSpan!.text, 'x'); + }); + + test('write', () { + buf.write('hello world'); + expect(buf.lastLine!.lastSpan!.text, 'hello world'); + }); + }); +} diff --git a/common/combinator/AUTHORS.md b/common/combinator/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/combinator/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/combinator/CHANGELOG.md b/common/combinator/CHANGELOG.md new file mode 100644 index 0000000..a4ced23 --- /dev/null +++ b/common/combinator/CHANGELOG.md @@ -0,0 +1,61 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 +* Fixed lints warnings + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 +* Updated `platform_code_buffer` to 5.0.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.0.1 + +* Fixed license link + +## 3.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Published as `platform_combinator` package +* Resolved static analysis warnings + +## 2.0.2 + +* Resolved static analysis warnings + +## 2.0.1 + +* Updated README + +## 2.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 1.1.0 + +* Add `tupleX` parsers. Hooray for strong typing! 
+ +## 1.0.0+3 + +* `then` now *always* returns `dynamic`. + +## 1.0.0+2 + +* `star` now includes a call to `opt`. +* Added comments. +* Enforce generics on `separatedBy`. +* Enforce Dart 2 semantics. diff --git a/common/combinator/LICENSE b/common/combinator/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/combinator/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file diff --git a/common/combinator/README.md b/common/combinator/README.md new file mode 100644 index 0000000..8906ea9 --- /dev/null +++ b/common/combinator/README.md @@ -0,0 +1,129 @@ +# Belatuk Combinator + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_combinator?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/combinator/LICENSE) + +**Replacement of `package:combinator` with breaking changes to support NNBD.** + +Packrat parser combinators that support static typing, generics, file spans, memoization, and more. + +**RECOMMENDED:** +Check `example/` for working examples, which demonstrate: + +* Generic typing +* Reading `FileSpan` from `ParseResult` +* More... + +## Basic Usage + +```dart +void main() { + // Parse a Pattern (usually String or RegExp). + var foo = match('foo'); + var number = match(RegExp(r'[0-9]+'), errorMessage: 'Expected a number.'); + + // Set a value. + var numWithValue = number.map((r) => int.parse(r.span.text)); + + // Expect a pattern, or nothing. + var optional = numWithValue.opt(); + + // Expect a pattern zero or more times. + var star = optional.star(); + + // Expect one or more times. + var plus = optional.plus(); + + // Expect an arbitrary number of times. + var threeTimes = optional.times(3); + + // Expect a sequence of patterns.
+ var doraTheExplorer = chain([ + match('Dora').space(), + match('the').space(), + match('Explorer').space(), + ]); + + // Choose exactly one of a set of patterns, whichever + // appears first. + var alt = any([ + match('1'), + match('11'), + match('111'), + ]); + + // Choose the *longest* match for any of the given alternatives. + var alt2 = longest([ + match('1'), + match('11'), + match('111'), + ]); + + // Friendly operators + var fooOrNumber = foo | number; + var fooAndNumber = foo & number; + var notFoo = ~foo; +} +``` + +## Error Messages + +Parsers without descriptive error messages can lead to frustrating dead-ends +for end-users. Fortunately, `platform_combinator` is built with error handling in mind. + +```dart +void main(Parser parser) { + // Append an arbitrary error message to a parser if it is not matched. + var withError = parser.error(errorMessage: 'Hey!!! Wrong!!!'); + + // You can also set the severity of an error. + var asHint = parser.error(severity: SyntaxErrorSeverity.hint); + + // Constructs like `any`, `chain`, and `longest` support this as well. + var foo = longest([ + parser.error(errorMessage: 'foo'), + parser.error(errorMessage: 'bar') + ], errorMessage: 'Expected a "foo" or a "bar"'); + + // If multiple errors are present at one location, + // it can create a lot of noise. + // + // Use `foldErrors` to only take one error at a given location. + var lessNoise = parser.foldErrors(); +} +``` + +## Whitespaces + +Handling optional whitespace is dead-easy: + +```dart +void main(Parser parser) { + var optionalSpace = parser.space(); +} +``` + +## For Programming Languages + +`platform_combinator` was conceived to make writing parsers for complex grammars easier, +namely programming languages. Thus, there are functions built-in to make common constructs +easier: + +```dart +void main(Parser parser) { + var array = parser + .separatedByComma() + .surroundedBySquareBrackets(defaultValue: []); + + var braces = parser.surroundedByCurlyBraces(); + + var sep = parser.separatedBy(match('!').space()); +} +``` + +## Differences between this and Petitparser + +* `platform_combinator` makes extensive use of Dart's dynamic typing +* `platform_combinator` supports detailed error messages (with configurable severity) +* `platform_combinator` keeps track of locations (ex. `line 1: 3`) diff --git a/common/combinator/analysis_options.yaml b/common/combinator/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/combinator/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/combinator/combinator.iml b/common/combinator/combinator.iml new file mode 100644 index 0000000..75734c9 --- /dev/null +++ b/common/combinator/combinator.iml @@ -0,0 +1,14 @@ + + + + + + + + + + + + + + \ No newline at end of file diff --git a/common/combinator/example/basic_auth.dart b/common/combinator/example/basic_auth.dart new file mode 100644 index 0000000..7aa23af --- /dev/null +++ b/common/combinator/example/basic_auth.dart @@ -0,0 +1,56 @@ +// Run this with "Basic QWxhZGRpbjpPcGVuU2VzYW1l" + +import 'dart:convert'; +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +/// Parse a part of a decoded Basic auth string. +/// +/// Namely, the `username` or `password` in `{username}:{password}`. 
+final Parser string = + match(RegExp(r'[^:$]+'), errorMessage: 'Expected a string.') + .value((r) => r.span!.text); + +/// Transforms `{username}:{password}` to `{"username": username, "password": password}`. +final Parser> credentials = chain([ + string.opt(), + match(':'), + string.opt(), +]).map>( + (r) => {'username': r.value![0], 'password': r.value![2]}); + +/// We can actually embed a parser within another parser. +/// +/// This is used here to BASE64URL-decode a string, and then +/// parse the decoded string. +final Parser credentialString = match?>( + RegExp(r'([^\n$]+)'), + errorMessage: 'Expected a credential string.') + .value((r) { + var decoded = utf8.decode(base64Url.decode(r.span!.text)); + var scanner = SpanScanner(decoded); + return credentials.parse(scanner).value; +}); + +final Parser basic = match('Basic').space(); + +final Parser basicAuth = basic.then(credentialString).index(1); + +void main() { + while (true) { + stdout.write('Enter a basic auth value: '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = basicAuth.parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + print(error.toolString); + print(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/calculator.dart b/common/combinator/example/calculator.dart new file mode 100644 index 0000000..a667aff --- /dev/null +++ b/common/combinator/example/calculator.dart @@ -0,0 +1,71 @@ +import 'dart:math'; +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +/// Note: This grammar does not handle precedence, for the sake of simplicity. +Parser calculatorGrammar() { + var expr = reference(); + + var number = match(RegExp(r'-?[0-9]+(\.[0-9]+)?')) + .value((r) => num.parse(r.span!.text)); + + var hex = match(RegExp(r'0x([A-Fa-f0-9]+)')) + .map((r) => int.parse(r.scanner.lastMatch![1]!, radix: 16)); + + var binary = match(RegExp(r'([0-1]+)b')) + .map((r) => int.parse(r.scanner.lastMatch![1]!, radix: 2)); + + var alternatives = >[]; + + void registerBinary(String op, num Function(num, num) f) { + alternatives.add( + chain([ + expr.space(), + match(op).space() as Parser, + expr.space(), + ]).map((r) => f(r.value![0], r.value![2])), + ); + } + + registerBinary('**', (a, b) => pow(a, b)); + registerBinary('*', (a, b) => a * b); + registerBinary('/', (a, b) => a / b); + registerBinary('%', (a, b) => a % b); + registerBinary('+', (a, b) => a + b); + registerBinary('-', (a, b) => a - b); + registerBinary('^', (a, b) => a.toInt() ^ b.toInt()); + registerBinary('&', (a, b) => a.toInt() & b.toInt()); + registerBinary('|', (a, b) => a.toInt() | b.toInt()); + + alternatives.addAll([ + number, + hex, + binary, + expr.parenthesized(), + ]); + + expr.parser = longest(alternatives); + + return expr; +} + +void main() { + var calculator = calculatorGrammar(); + + while (true) { + stdout.write('Enter an expression: '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = calculator.parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + stderr.writeln(error.toolString); + stderr.writeln(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/delimiter.dart b/common/combinator/example/delimiter.dart new file mode 100644 index 0000000..9baf300 --- /dev/null +++ 
b/common/combinator/example/delimiter.dart @@ -0,0 +1,29 @@ +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +final Parser id = + match(RegExp(r'[A-Za-z]+')).value((r) => r.span!.text); + +// We can use `separatedBy` to easily construct parser +// that can be matched multiple times, separated by another +// pattern. +// +// This is useful for parsing arrays or map literals. +void main() { + while (true) { + stdout.write('Enter a string (ex "a,b,c"): '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = id.separatedBy(match(',').space()).parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + print(error.toolString); + print(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/json.dart b/common/combinator/example/json.dart new file mode 100644 index 0000000..cc679cd --- /dev/null +++ b/common/combinator/example/json.dart @@ -0,0 +1,71 @@ +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +Parser jsonGrammar() { + var expr = reference(); + + // Parse a number + var number = match(RegExp(r'-?[0-9]+(\.[0-9]+)?'), + errorMessage: 'Expected a number.') + .value( + (r) => num.parse(r.span!.text), + ); + + // Parse a string (no escapes supported, because lazy). + var string = + match(RegExp(r'"[^"]*"'), errorMessage: 'Expected a string.').value( + (r) => r.span!.text.substring(1, r.span!.text.length - 1), + ); + + // Parse an array + var array = expr + .space() + .separatedByComma() + .surroundedBySquareBrackets(defaultValue: []); + + // KV pair + var keyValuePair = chain([ + string.space(), + match(':').space(), + expr.error(errorMessage: 'Missing expression.'), + ]).castDynamic().cast().value((r) => {r.value![0]: r.value![2]}); + + // Parse an object. + var object = keyValuePair + .separatedByComma() + .castDynamic() + .surroundedByCurlyBraces(defaultValue: {}); + + expr.parser = longest( + [ + array, + number, + string, + object.error(), + ], + errorMessage: 'Expected an expression.', + ).space(); + + return expr.foldErrors(); +} + +void main() { + var json = jsonGrammar(); + + while (true) { + stdout.write('Enter some JSON: '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = json.parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + print(error.toolString); + print(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/main.dart b/common/combinator/example/main.dart new file mode 100644 index 0000000..0435493 --- /dev/null +++ b/common/combinator/example/main.dart @@ -0,0 +1,38 @@ +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +final Parser minus = match('-'); + +final Parser digit = + match(RegExp(r'[0-9]'), errorMessage: 'Expected a number'); + +final Parser digits = digit.plus(); + +final Parser dot = match('.'); + +final Parser decimal = ( // digits, (dot, digits)? 
+ digits & (dot & digits).opt() // + ); + +final Parser number = // + (minus.opt() & decimal) // minus?, decimal + .map((r) => num.parse(r.span!.text)); + +void main() { + while (true) { + stdout.write('Enter a number: '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = number.parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + stderr.writeln(error.toolString); + stderr.writeln(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/query_string.dart b/common/combinator/example/query_string.dart new file mode 100644 index 0000000..6b34d90 --- /dev/null +++ b/common/combinator/example/query_string.dart @@ -0,0 +1,45 @@ +// For some reason, this cannot be run in checked mode??? + +import 'dart:io'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +final Parser key = + match(RegExp(r'[^=&\n]+'), errorMessage: 'Missing k/v') + .value((r) => r.span!.text); + +final Parser value = key.map((r) => Uri.decodeQueryComponent(r.value!)); + +final Parser pair = chain([ + key, + match('='), + value, +]).map((r) { + return { + r.value![0]: r.value![2], + }; +}); + +final Parser pairs = pair + .separatedBy(match(r'&')) + .map((r) => r.value!.reduce((a, b) => a..addAll(b))); + +final Parser queryString = pairs.opt(); + +void main() { + while (true) { + stdout.write('Enter a query string: '); + var line = stdin.readLineSync()!; + var scanner = SpanScanner(line, sourceUrl: 'stdin'); + var result = pairs.parse(scanner); + + if (!result.successful) { + for (var error in result.errors) { + print(error.toolString); + print(error.span!.highlight(color: true)); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/sexp.dart b/common/combinator/example/sexp.dart new file mode 100644 index 0000000..9a7014b --- /dev/null +++ b/common/combinator/example/sexp.dart @@ -0,0 +1,85 @@ +import 'dart:collection'; +import 'dart:io'; +import 'dart:math'; +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; +import 'package:tuple/tuple.dart'; + +void main() { + var expr = reference(); + var symbols = {}; + + void registerFunction(String name, int nArgs, Function(List) f) { + symbols[name] = Tuple2(nArgs, f); + } + + registerFunction('**', 2, (args) => pow(args[0], args[1])); + registerFunction('*', 2, (args) => args[0] * args[1]); + registerFunction('/', 2, (args) => args[0] / args[1]); + registerFunction('%', 2, (args) => args[0] % args[1]); + registerFunction('+', 2, (args) => args[0] + args[1]); + registerFunction('-', 2, (args) => args[0] - args[1]); + registerFunction('.', 1, (args) => args[0].toDouble()); + registerFunction('print', 1, (args) { + print(args[0]); + return args[0]; + }); + + var number = + match(RegExp(r'[0-9]+(\.[0-9]+)?'), errorMessage: 'Expected a number.') + .map((r) => num.parse(r.span!.text)); + + var id = match( + RegExp( + r'[A-Za-z_!\\$",\\+-\\./:;\\?<>%&\\*@\[\]\\{\}\\|`\\^~][A-Za-z0-9_!\\$",\\+-\\./:;\\?<>%&\*@\[\]\\{\}\\|`\\^~]*'), + errorMessage: 'Expected an ID') + .map((r) => symbols[r.span!.text] ??= + throw "Undefined symbol: '${r.span!.text}'"); + + var atom = number.castDynamic().or(id); + + var list = expr.space().times(2, exact: false).map((r) { + try { + var out = []; + var q = Queue.from(r.value!.reversed); + + while (q.isNotEmpty) { + var current = q.removeFirst(); + if 
(current is! Tuple2) { + out.insert(0, current); + } else { + var args = []; + for (var i = 0; i < (current.item1 as num); i++) { + args.add(out.removeLast()); + } + out.add(current.item2(args)); + } + } + + return out.length == 1 ? out.first : out; + } catch (_) { + return []; + } + }); + + expr.parser = longest([ + list, + atom, + expr.parenthesized(), + ]); //list | atom | expr.parenthesized(); + + while (true) { + stdout.write('> '); + var line = stdin.readLineSync()!; + var result = expr.parse(SpanScanner(line)); + + if (result.errors.isNotEmpty) { + for (var error in result.errors) { + print(error.toolString); + print(error.message); + } + } else { + print(result.value); + } + } +} diff --git a/common/combinator/example/tuple.dart b/common/combinator/example/tuple.dart new file mode 100644 index 0000000..bdef872 --- /dev/null +++ b/common/combinator/example/tuple.dart @@ -0,0 +1,14 @@ +import 'package:platform_combinator/combinator.dart'; +import 'package:string_scanner/string_scanner.dart'; + +void main() { + var pub = match('pub').map((r) => r.span!.text).space(); + var dart = match('dart').map((r) => 24).space(); + var lang = match('lang').map((r) => true).space(); + + // Parses a Tuple3 + var grammar = tuple3(pub, dart, lang); + + var scanner = SpanScanner('pub dart lang'); + print(grammar.parse(scanner).value); +} diff --git a/common/combinator/lib/combinator.dart b/common/combinator/lib/combinator.dart new file mode 100644 index 0000000..79e4c07 --- /dev/null +++ b/common/combinator/lib/combinator.dart @@ -0,0 +1,2 @@ +export 'src/combinator/combinator.dart'; +export 'src/error.dart'; diff --git a/common/combinator/lib/src/combinator/advance.dart b/common/combinator/lib/src/combinator/advance.dart new file mode 100644 index 0000000..868df58 --- /dev/null +++ b/common/combinator/lib/src/combinator/advance.dart @@ -0,0 +1,26 @@ +part of 'combinator.dart'; + +class _Advance extends Parser { + final Parser parser; + final int amount; + + _Advance(this.parser, this.amount); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()).change(parser: this); + if (result.successful) args.scanner.position += amount; + return result; + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('advance($amount) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/any.dart b/common/combinator/lib/src/combinator/any.dart new file mode 100644 index 0000000..c5d5b93 --- /dev/null +++ b/common/combinator/lib/src/combinator/any.dart @@ -0,0 +1,85 @@ +part of 'combinator.dart'; + +/// Matches any one of the given [parsers]. +/// +/// If [backtrack] is `true` (default), a failed parse will not modify the scanner state. +/// +/// You can provide a custom [errorMessage]. You can set it to `false` to not +/// generate any error at all. +Parser any(Iterable> parsers, + {bool backtrack = true, errorMessage, SyntaxErrorSeverity? severity}) { + return _Any(parsers, backtrack != false, errorMessage, + severity ?? 
SyntaxErrorSeverity.error); +} + +class _Any extends Parser { + final Iterable> parsers; + final bool backtrack; + final dynamic errorMessage; + final SyntaxErrorSeverity severity; + + _Any(this.parsers, this.backtrack, this.errorMessage, this.severity); + + @override + ParseResult _parse(ParseArgs args) { + var inactive = parsers + .where((p) => !args.trampoline.isActive(p, args.scanner.position)); + + if (inactive.isEmpty) { + return ParseResult(args.trampoline, args.scanner, this, false, []); + } + + var errors = []; + var replay = args.scanner.position; + + for (var parser in inactive) { + var result = parser._parse(args.increaseDepth()); + + if (result.successful) { + return result; + } else { + if (backtrack) args.scanner.position = replay; + if (parser is _Alt) errors.addAll(result.errors); + } + } + + if (errorMessage != false) { + errors.add( + SyntaxError( + severity, + errorMessage?.toString() ?? + 'No match found for ${parsers.length} alternative(s)', + args.scanner.emptySpan, + ), + ); + } + + return ParseResult(args.trampoline, args.scanner, this, false, errors); + } + + @override + ParseResult __parse(ParseArgs args) { + // Never called + throw ArgumentError('[Combinator] Invalid method call'); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('any(${parsers.length}) (') + ..indent(); + var i = 1; + + for (var parser in parsers) { + buffer + ..writeln('#${i++}:') + ..indent(); + parser.stringify(buffer); + buffer.outdent(); + } + + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/cache.dart b/common/combinator/lib/src/combinator/cache.dart new file mode 100644 index 0000000..268224d --- /dev/null +++ b/common/combinator/lib/src/combinator/cache.dart @@ -0,0 +1,26 @@ +part of 'combinator.dart'; + +class _Cache extends Parser { + final Map> _cache = {}; + final Parser parser; + + _Cache(this.parser); + + @override + ParseResult __parse(ParseArgs args) { + return _cache.putIfAbsent(args.scanner.position, () { + return parser._parse(args.increaseDepth()); + }).change(parser: this); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('cache(${_cache.length}) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/cast.dart b/common/combinator/lib/src/combinator/cast.dart new file mode 100644 index 0000000..eaa9044 --- /dev/null +++ b/common/combinator/lib/src/combinator/cast.dart @@ -0,0 +1,63 @@ +part of 'combinator.dart'; + +class _Cast extends Parser { + final Parser parser; + + _Cast(this.parser); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + result.errors, + span: result.span, + value: result.value as U?, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('cast<$U> (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} + +class _CastDynamic extends Parser { + final Parser parser; + + _CastDynamic(this.parser); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + result.errors, + span: result.span, + value: result.value, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('cast (') + ..indent(); + 
parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/chain.dart b/common/combinator/lib/src/combinator/chain.dart new file mode 100644 index 0000000..4f63a2f --- /dev/null +++ b/common/combinator/lib/src/combinator/chain.dart @@ -0,0 +1,111 @@ +part of 'combinator.dart'; + +/// Expects to parse a sequence of [parsers]. +/// +/// If [failFast] is `true` (default), then the first failure to parse will abort the parse. +ListParser chain(Iterable> parsers, + {bool failFast = true, SyntaxErrorSeverity? severity}) { + return _Chain( + parsers, failFast != false, severity ?? SyntaxErrorSeverity.error); +} + +class _Alt extends Parser { + final Parser parser; + final String? errorMessage; + final SyntaxErrorSeverity severity; + + _Alt(this.parser, this.errorMessage, this.severity); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + return result.successful + ? result + : result.addErrors([ + SyntaxError( + severity, errorMessage, result.span ?? args.scanner.emptySpan), + ]); + } + + @override + void stringify(CodeBuffer buffer) { + parser.stringify(buffer); + } +} + +class _Chain extends ListParser { + final Iterable> parsers; + final bool failFast; + final SyntaxErrorSeverity severity; + + _Chain(this.parsers, this.failFast, this.severity); + + @override + ParseResult> __parse(ParseArgs args) { + var errors = []; + var results = []; + var spans = []; + var successful = true; + + for (var parser in parsers) { + var result = parser._parse(args.increaseDepth()); + + if (!result.successful) { + if (parser is _Alt) errors.addAll(result.errors); + + if (failFast) { + return ParseResult( + args.trampoline, args.scanner, this, false, result.errors); + } + + successful = false; + } + + if (result.value != null) { + results.add(result.value as T); + } else { + results.add('NULL' as T); + } + + if (result.span != null) { + spans.add(result.span!); + } + } + + FileSpan? span; + + if (spans.isNotEmpty) { + span = spans.reduce((a, b) => a.expand(b)); + } + + return ParseResult>( + args.trampoline, + args.scanner, + this, + successful, + errors, + span: span, + value: List.unmodifiable(results), + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('chain(${parsers.length}) (') + ..indent(); + var i = 1; + + for (var parser in parsers) { + buffer + ..writeln('#${i++}:') + ..indent(); + parser.stringify(buffer); + buffer.outdent(); + } + + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/check.dart b/common/combinator/lib/src/combinator/check.dart new file mode 100644 index 0000000..fa575b3 --- /dev/null +++ b/common/combinator/lib/src/combinator/check.dart @@ -0,0 +1,42 @@ +part of 'combinator.dart'; + +class _Check extends Parser { + final Parser parser; + final Matcher matcher; + final String? errorMessage; + final SyntaxErrorSeverity severity; + + _Check(this.parser, this.matcher, this.errorMessage, this.severity); + + @override + ParseResult __parse(ParseArgs args) { + var matchState = {}; + var result = parser._parse(args.increaseDepth()).change(parser: this); + if (!result.successful) { + return result; + } else if (!matcher.matches(result.value, matchState)) { + return result.change(successful: false).addErrors([ + SyntaxError( + severity, + errorMessage ?? 
+ '${matcher.describe(StringDescription('Expected '))}.', + result.span, + ), + ]); + } else { + return result; + } + } + + @override + void stringify(CodeBuffer buffer) { + var d = matcher.describe(StringDescription()); + buffer + ..writeln('check($d) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/combinator.dart b/common/combinator/lib/src/combinator/combinator.dart new file mode 100644 index 0000000..3a0bc52 --- /dev/null +++ b/common/combinator/lib/src/combinator/combinator.dart @@ -0,0 +1,394 @@ +library lex.src.combinator; + +import 'dart:collection'; + +import 'package:platform_code_buffer/code_buffer.dart'; +import 'package:matcher/matcher.dart'; +import 'package:source_span/source_span.dart'; +import 'package:string_scanner/string_scanner.dart'; +import 'package:tuple/tuple.dart'; +import '../error.dart'; + +part 'any.dart'; + +part 'advance.dart'; + +part 'cache.dart'; + +part 'cast.dart'; + +part 'chain.dart'; + +part 'check.dart'; + +part 'compare.dart'; + +part 'fold_errors.dart'; + +part 'index.dart'; + +part 'longest.dart'; + +part 'map.dart'; + +part 'match.dart'; + +part 'max_depth.dart'; + +part 'negate.dart'; + +part 'opt.dart'; + +part 'recursion.dart'; + +part 'reduce.dart'; + +part 'reference.dart'; + +part 'repeat.dart'; + +part 'safe.dart'; + +part 'to_list.dart'; + +part 'util.dart'; + +part 'value.dart'; + +class ParseArgs { + final Trampoline trampoline; + final SpanScanner scanner; + final int depth; + + ParseArgs(this.trampoline, this.scanner, this.depth); + + ParseArgs increaseDepth() => ParseArgs(trampoline, scanner, depth + 1); +} + +/// A parser combinator, which can parse very complicated grammars in a manageable manner. +abstract class Parser { + ParseResult __parse(ParseArgs args); + + ParseResult _parse(ParseArgs args) { + var pos = args.scanner.position; + + if (args.trampoline.hasMemoized(this, pos)) { + return args.trampoline.getMemoized(this, pos); + } + + if (args.trampoline.isActive(this, pos)) { + return ParseResult(args.trampoline, args.scanner, this, false, []); + } + + args.trampoline.enter(this, pos); + var result = __parse(args); + args.trampoline.memoize(this, pos, result); + args.trampoline.exit(this); + return result; + } + + /// Parses text from a [SpanScanner]. + ParseResult parse(SpanScanner scanner, [int depth = 1]) { + var args = ParseArgs(Trampoline(), scanner, depth); + return _parse(args); + } + + /// Skips forward a certain amount of steps after parsing, if it was successful. + Parser forward(int amount) => _Advance(this, amount); + + /// Moves backward a certain amount of steps after parsing, if it was successful. + Parser back(int amount) => _Advance(this, amount * -1); + + /// Casts this parser to produce [U] objects. + Parser cast() => _Cast(this); + + /// Casts this parser to produce [dynamic] objects. + Parser castDynamic() => _CastDynamic(this); + + /// Runs the given function, which changes the returned [ParseResult] into one relating to a [U] object. + Parser change(ParseResult Function(ParseResult) f) { + return _Change(this, f); + } + + /// Validates the parse result against a [Matcher]. + /// + /// You can provide a custom [errorMessage]. + Parser check(Matcher matcher, + {String? errorMessage, SyntaxErrorSeverity? severity}) => + _Check( + this, matcher, errorMessage, severity ?? SyntaxErrorSeverity.error); + + /// Binds an [errorMessage] to a copy of this parser. + Parser error({String? errorMessage, SyntaxErrorSeverity? 
severity}) => + _Alt(this, errorMessage, severity ?? SyntaxErrorSeverity.error); + + /// Removes multiple errors that occur in the same spot; this can reduce noise in parser output. + Parser foldErrors({bool Function(SyntaxError a, SyntaxError b)? equal}) { + equal ??= (b, e) => b.span?.start.offset == e.span?.start.offset; + return _FoldErrors(this, equal); + } + + /// Transforms the parse result using a unary function. + Parser map(U Function(ParseResult) f) { + return _Map(this, f); + } + + /// Prevents recursion past a certain [depth], preventing stack overflow errors. + Parser maxDepth(int depth) => _MaxDepth(this, depth); + + Parser operator ~() => negate(); + + /// Ensures this pattern is not matched. + /// + /// You can provide an [errorMessage]. + Parser negate( + {String errorMessage = 'Negate error', + SyntaxErrorSeverity severity = SyntaxErrorSeverity.error}) => + _Negate(this, errorMessage, severity); + + /// Caches the results of parse attempts at various locations within the source text. + /// + /// Use this to prevent excessive recursion. + Parser cache() => _Cache(this); + + Parser operator &(Parser other) => and(other); + + /// Consumes `this` and another parser, but only considers the result of `this` parser. + Parser and(Parser other) => then(other).change((r) { + return ParseResult( + r.trampoline, + r.scanner, + this, + r.successful, + r.errors, + span: r.span, + value: (r.value != null ? r.value![0] : r.value) as T?, + ); + }); + + Parser operator |(Parser other) => or(other); + + /// Shortcut for [or]-ing two parsers. + Parser or(Parser other) => any([this, other]); + + /// Parses this sequence one or more times. + ListParser plus() => times(1, exact: false); + + /// Safely escapes this parser when an error occurs. + /// + /// The generated parser only runs once; repeated uses always exit eagerly. + Parser safe( + {bool backtrack = true, + String errorMessage = 'error', + SyntaxErrorSeverity? severity}) => + _Safe( + this, backtrack, errorMessage, severity ?? SyntaxErrorSeverity.error); + + Parser> separatedByComma() => + separatedBy(match>(',').space()); + + /// Expects to see an infinite amounts of the pattern, separated by the [other] pattern. + /// + /// Use this as a shortcut to parse arrays, parameter lists, etc. + Parser> separatedBy(Parser other) { + var suffix = other.then(this).index(1).cast(); + return then(suffix.star()).map((r) { + var v = r.value; + if (v == null || v.length < 2) { + return []; + } + var preceding = v.isEmpty ? [] : (v[0] == null ? [] : [v[0]]); + var out = List.from(preceding); + if (v[1] != null && v[1] != 'NULL') { + v[1].forEach((element) { + out.add(element as T); + }); + } + return out; + }); + } + + Parser surroundedByCurlyBraces({required T defaultValue}) => opt() + .surroundedBy(match('{').space(), match('}').space()) + .map((r) => r.value ?? defaultValue); + + Parser surroundedBySquareBrackets({required T defaultValue}) => opt() + .surroundedBy(match('[').space(), match(']').space()) + .map((r) => r.value ?? defaultValue); + + /// Expects to see the pattern, surrounded by the others. + /// + /// If no [right] is provided, it expects to see the same pattern on both sides. + /// Use this parse things like parenthesized expressions, arrays, etc. + Parser surroundedBy(Parser left, [Parser? right]) { + return chain([ + left, + this, + right ?? left, + ]).index(1).castDynamic().cast(); + } + + /// Parses `this`, either as-is or wrapped in parentheses. 
+ Parser maybeParenthesized() { + return any([parenthesized(), this]); + } + + /// Parses `this`, wrapped in parentheses. + Parser parenthesized() => + surroundedBy(match('(').space(), match(')').space()); + + /// Consumes any trailing whitespace. + Parser space() => trail(RegExp(r'[ \n\r\t]+')); + + /// Consumes 0 or more instance(s) of this parser. + ListParser star({bool backtrack = true}) => + times(1, exact: false, backtrack: backtrack).opt(); + + /// Shortcut for [chain]-ing two parsers together. + ListParser then(Parser other) => chain([this, other]); + + /// Casts this instance into a [ListParser]. + ListParser toList() => _ToList(this); + + /// Consumes and ignores any trailing occurrences of [pattern]. + Parser trail(Pattern pattern) => + then(match(pattern).opt()).first().cast(); + + /// Expect this pattern a certain number of times. + /// + /// If [exact] is `false` (default: `true`), then the generated parser will accept + /// an infinite amount of occurrences after the specified [count]. + /// + /// You can provide custom error messages for when there are [tooFew] or [tooMany] occurrences. + ListParser times(int count, + {bool exact = true, + String tooFew = 'Too few', + String tooMany = 'Too many', + bool backtrack = true, + SyntaxErrorSeverity? severity}) { + return _Repeat(this, count, exact, tooFew, tooMany, backtrack, + severity ?? SyntaxErrorSeverity.error); + } + + /// Produces an optional copy of this parser. + /// + /// If [backtrack] is `true` (default), then a failed parse will not + /// modify the scanner state. + Parser opt({bool backtrack = true}) => _Opt(this, backtrack); + + /// Sets the value of the [ParseResult]. + Parser value(T Function(ParseResult) f) { + return _Value(this, f); + } + + /// Prints a representation of this parser, ideally without causing a stack overflow. + void stringify(CodeBuffer buffer); +} + +/// A [Parser] that produces [List]s of a type [T]. +abstract class ListParser extends Parser> { + /// Shortcut for calling [index] with `0`. + Parser first() => index(0); + + /// Modifies this parser to only return the value at the given index [i]. + Parser index(int i) => _Index(this, i); + + /// Shortcut for calling [index] with the greatest-possible index. + Parser last() => index(-1); + + /// Modifies this parser to call `List.reduce` on the parsed values. + Parser reduce(T Function(T, T) combine) => _Reduce(this, combine); + + /// Sorts the parsed values, using the given [Comparator]. + ListParser sort(Comparator compare) => _Compare(this, compare); + + @override + ListParser opt({bool backtrack = true}) => _ListOpt(this, backtrack); + + /// Modifies this parser, returning only the values that match a predicate. + Parser> where(bool Function(T) f) => + map>((r) => r.value?.where(f).toList() ?? []); + + /// Condenses a [ListParser] into having a value of the combined span's text. + Parser flatten() => map((r) => r.span?.text ?? ''); +} + +/// Prevents stack overflow in recursive parsers. +class Trampoline { + final Map> _active = {}; + final Map>> _memo = {}; + + bool hasMemoized(Parser parser, int position) { + var list = _memo[parser]; + return list?.any((t) => t.item1 == position) == true; + } + + ParseResult getMemoized(Parser parser, int position) { + return _memo[parser]?.firstWhere((t) => t.item1 == position).item2 + as ParseResult; + } + + void memoize(Parser parser, int position, ParseResult? 
result) { + if (result != null) { + var list = _memo.putIfAbsent(parser, () => []); + var tuple = Tuple2(position, result); + if (!list.contains(tuple)) list.add(tuple); + } + } + + bool isActive(Parser parser, int position) { + if (!_active.containsKey(parser)) { + return false; + } + var q = _active[parser]!; + if (q.isEmpty) return false; + //return q.contains(position); + return q.first == position; + } + + void enter(Parser parser, int position) { + _active.putIfAbsent(parser, () => Queue()).addFirst(position); + } + + void exit(Parser parser) { + if (_active.containsKey(parser)) _active[parser]?.removeFirst(); + } +} + +/// The result generated by a [Parser]. +class ParseResult { + final Parser parser; + final bool successful; + final Iterable errors; + final FileSpan? span; + final T? value; + final SpanScanner scanner; + final Trampoline trampoline; + + ParseResult( + this.trampoline, this.scanner, this.parser, this.successful, this.errors, + {this.span, this.value}); + + ParseResult change( + {Parser? parser, + bool? successful, + Iterable errors = const [], + FileSpan? span, + T? value}) { + return ParseResult( + trampoline, + scanner, + parser ?? this.parser, + successful ?? this.successful, + errors.isNotEmpty ? errors : this.errors, + span: span ?? this.span, + value: value ?? this.value, + ); + } + + ParseResult addErrors(Iterable errors) { + return change( + errors: List.from(this.errors)..addAll(errors), + ); + } +} diff --git a/common/combinator/lib/src/combinator/compare.dart b/common/combinator/lib/src/combinator/compare.dart new file mode 100644 index 0000000..1448569 --- /dev/null +++ b/common/combinator/lib/src/combinator/compare.dart @@ -0,0 +1,38 @@ +part of 'combinator.dart'; + +class _Compare extends ListParser { + final ListParser parser; + final Comparator compare; + + _Compare(this.parser, this.compare); + + @override + ParseResult> __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + if (!result.successful) return result; + + result = result.change( + value: result.value?.isNotEmpty == true ? 
result.value : []); + result = result.change(value: List.from(result.value!)); + return ParseResult>( + args.trampoline, + args.scanner, + this, + true, + [], + span: result.span, + value: result.value?..sort(compare), + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('sort($compare) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/fold_errors.dart b/common/combinator/lib/src/combinator/fold_errors.dart new file mode 100644 index 0000000..05f95f5 --- /dev/null +++ b/common/combinator/lib/src/combinator/fold_errors.dart @@ -0,0 +1,29 @@ +part of 'combinator.dart'; + +class _FoldErrors extends Parser { + final Parser parser; + final bool Function(SyntaxError, SyntaxError) equal; + + _FoldErrors(this.parser, this.equal); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()).change(parser: this); + var errors = result.errors.fold>([], (out, e) { + if (!out.any((b) => equal(e, b))) out.add(e); + return out; + }); + return result.change(errors: errors); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('fold errors (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/index.dart b/common/combinator/lib/src/combinator/index.dart new file mode 100644 index 0000000..1a0d114 --- /dev/null +++ b/common/combinator/lib/src/combinator/index.dart @@ -0,0 +1,52 @@ +part of 'combinator.dart'; + +class _Index extends Parser { + final ListParser parser; + final int index; + + _Index(this.parser, this.index); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + Object? value; + + if (result.successful) { + var vList = result.value; + if (vList == null) { + throw ArgumentError('ParseResult is null'); + } + if (index == -1) { + value = vList.last; + } else { + if (index < vList.length) { +// print(">>>>Index: $index, Size: ${vList.length}"); +// value = +// index == -1 ? result.value!.last : result.value!.elementAt(index); + value = result.value!.elementAt(index); + } + } + } + + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + result.errors, + span: result.span, + value: value as T?, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('index($index) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/longest.dart b/common/combinator/lib/src/combinator/longest.dart new file mode 100644 index 0000000..ff9fe71 --- /dev/null +++ b/common/combinator/lib/src/combinator/longest.dart @@ -0,0 +1,119 @@ +part of 'combinator.dart'; + +/// Matches any one of the given [parsers]. +/// +/// You can provide a custom [errorMessage]. +Parser longest(Iterable> parsers, + {Object? errorMessage, SyntaxErrorSeverity? severity}) { + return _Longest(parsers, errorMessage, severity ?? SyntaxErrorSeverity.error); +} + +class _Longest extends Parser { + final Iterable> parsers; + final Object? 
errorMessage; + final SyntaxErrorSeverity severity; + + _Longest(this.parsers, this.errorMessage, this.severity); + + @override + ParseResult _parse(ParseArgs args) { + var inactive = parsers + .toList() + .where((p) => !args.trampoline.isActive(p, args.scanner.position)); + + if (inactive.isEmpty) { + return ParseResult(args.trampoline, args.scanner, this, false, []); + } + + var replay = args.scanner.position; + var errors = []; + var results = >[]; + + for (var parser in inactive) { + var result = parser._parse(args.increaseDepth()); + + if (result.successful && result.span != null) { + results.add(result); + } else if (parser is _Alt) { + errors.addAll(result.errors); + } + + args.scanner.position = replay; + } + + if (results.isNotEmpty) { + results.sort((a, b) => b.span!.length.compareTo(a.span!.length)); + args.scanner.scan(results.first.span!.text); + return results.first; + } + + if (errorMessage != false) { + errors.add( + SyntaxError( + severity, + errorMessage?.toString() ?? + 'No match found for ${parsers.length} alternative(s)', + args.scanner.emptySpan, + ), + ); + } + + return ParseResult(args.trampoline, args.scanner, this, false, errors); + } + + @override + ParseResult __parse(ParseArgs args) { + var replay = args.scanner.position; + var errors = []; + var results = >[]; + + for (var parser in parsers) { + var result = parser._parse(args.increaseDepth()); + + if (result.successful) { + results.add(result); + } else if (parser is _Alt) { + errors.addAll(result.errors); + } + + args.scanner.position = replay; + } + + if (results.isNotEmpty) { + results.sort((a, b) => b.span!.length.compareTo(a.span!.length)); + args.scanner.scan(results.first.span!.text); + return results.first; + } + + errors.add( + SyntaxError( + severity, + errorMessage?.toString() ?? + 'No match found for ${parsers.length} alternative(s)', + args.scanner.emptySpan, + ), + ); + + return ParseResult(args.trampoline, args.scanner, this, false, errors); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('longest(${parsers.length}) (') + ..indent(); + var i = 1; + + for (var parser in parsers) { + buffer + ..writeln('#${i++}:') + ..indent(); + parser.stringify(buffer); + buffer.outdent(); + } + + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/map.dart b/common/combinator/lib/src/combinator/map.dart new file mode 100644 index 0000000..f0ec10b --- /dev/null +++ b/common/combinator/lib/src/combinator/map.dart @@ -0,0 +1,56 @@ +part of 'combinator.dart'; + +class _Map extends Parser { + final Parser parser; + final U Function(ParseResult) f; + + _Map(this.parser, this.f); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + result.errors, + span: result.span, + value: result.successful ? 
f(result) : null, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('map<$U> (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} + +class _Change extends Parser { + final Parser parser; + final ParseResult Function(ParseResult) f; + + _Change(this.parser, this.f); + + @override + ParseResult __parse(ParseArgs args) { + return f(parser._parse(args.increaseDepth())).change(parser: this); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('change($f) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/match.dart b/common/combinator/lib/src/combinator/match.dart new file mode 100644 index 0000000..10c65b9 --- /dev/null +++ b/common/combinator/lib/src/combinator/match.dart @@ -0,0 +1,41 @@ +part of 'combinator.dart'; + +/// Expects to match a given [pattern]. If it is not matched, you can provide a custom [errorMessage]. +Parser match(Pattern pattern, + {String? errorMessage, SyntaxErrorSeverity? severity}) => + _Match(pattern, errorMessage, severity ?? SyntaxErrorSeverity.error); + +class _Match extends Parser { + final Pattern pattern; + final String? errorMessage; + final SyntaxErrorSeverity severity; + + _Match(this.pattern, this.errorMessage, this.severity); + + @override + ParseResult __parse(ParseArgs args) { + var scanner = args.scanner; + if (!scanner.scan(pattern)) { + return ParseResult(args.trampoline, scanner, this, false, [ + SyntaxError( + severity, + errorMessage ?? 'Expected "$pattern".', + scanner.emptySpan, + ), + ]); + } + return ParseResult( + args.trampoline, + scanner, + this, + true, + [], + span: scanner.lastSpan, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer.writeln('match($pattern)'); + } +} diff --git a/common/combinator/lib/src/combinator/max_depth.dart b/common/combinator/lib/src/combinator/max_depth.dart new file mode 100644 index 0000000..3fdf22e --- /dev/null +++ b/common/combinator/lib/src/combinator/max_depth.dart @@ -0,0 +1,28 @@ +part of 'combinator.dart'; + +class _MaxDepth extends Parser { + final Parser parser; + final int cap; + + _MaxDepth(this.parser, this.cap); + + @override + ParseResult __parse(ParseArgs args) { + if (args.depth > cap) { + return ParseResult(args.trampoline, args.scanner, this, false, []); + } + + return parser._parse(args.increaseDepth()); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('max depth($cap) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/negate.dart b/common/combinator/lib/src/combinator/negate.dart new file mode 100644 index 0000000..4c6119c --- /dev/null +++ b/common/combinator/lib/src/combinator/negate.dart @@ -0,0 +1,51 @@ +part of 'combinator.dart'; + +class _Negate extends Parser { + final Parser parser; + final String? errorMessage; + final SyntaxErrorSeverity severity; + + _Negate(this.parser, this.errorMessage, this.severity); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()).change(parser: this); + + if (!result.successful) { + return ParseResult( + args.trampoline, + args.scanner, + this, + true, + [], + span: result.span ?? args.scanner.lastSpan ?? 
args.scanner.emptySpan, + value: result.value, + ); + } + + result = result.change(successful: false); + + if (errorMessage != null) { + result = result.addErrors([ + SyntaxError( + severity, + errorMessage, + result.span, + ), + ]); + } + + return result; + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('negate (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/opt.dart b/common/combinator/lib/src/combinator/opt.dart new file mode 100644 index 0000000..8d1b43c --- /dev/null +++ b/common/combinator/lib/src/combinator/opt.dart @@ -0,0 +1,57 @@ +part of 'combinator.dart'; + +class _Opt extends Parser { + final Parser parser; + final bool backtrack; + + _Opt(this.parser, this.backtrack); + + @override + ParseResult __parse(ParseArgs args) { + var replay = args.scanner.position; + var result = parser._parse(args.increaseDepth()); + + if (!result.successful) args.scanner.position = replay; + + return result.change(parser: this, successful: true); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('optional (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} + +class _ListOpt extends ListParser { + final ListParser parser; + final bool backtrack; + + _ListOpt(this.parser, this.backtrack); + + @override + ParseResult> __parse(ParseArgs args) { + var replay = args.scanner.position; + var result = parser._parse(args.increaseDepth()); + + if (!result.successful) args.scanner.position = replay; + + return result.change(parser: this, successful: true); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('optional (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/recursion.dart b/common/combinator/lib/src/combinator/recursion.dart new file mode 100644 index 0000000..b23e784 --- /dev/null +++ b/common/combinator/lib/src/combinator/recursion.dart @@ -0,0 +1,142 @@ +part of 'combinator.dart'; + +/* +/// Handles left recursion in a grammar using the Pratt algorithm. 
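+///
+/// A sketch of the idea (added for illustration, not from the original
+/// author): each infix operator is assigned a precedence level. After one
+/// of the [prefix] parsers produces a left-hand value, the algorithm keeps
+/// folding in infix operators whose level is at least the current one, so
+/// `1 + 2 * 3` groups as `1 + (2 * 3)` rather than `(1 + 2) * 3`.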
+class Recursion { + Iterable> prefix; + Map)> infix; + Map)> postfix; + + Recursion({this.prefix, this.infix, this.postfix}) { + prefix ??= []; + infix ??= {}; + postfix ??= {}; + } + + Parser precedence(int p) => _Precedence(this, p); + + void stringify(CodeBuffer buffer) { + buffer + ..writeln('recursion (') + ..indent() + ..writeln('prefix(${prefix.length}') + ..writeln('infix(${infix.length}') + ..writeln('postfix(${postfix.length}') + ..outdent() + ..writeln(')'); + } +} + +class _Precedence extends Parser { + final Recursion r; + final int precedence; + + _Precedence(this.r, this.precedence); + + @override + ParseResult __parse(ParseArgs args) { + int replay = args.scanner.position; + var errors = []; + var start = args.scanner.state; + var reversedKeys = r.infix.keys.toList().reversed; + + for (var pre in r.prefix) { + var result = pre._parse(args.increaseDepth()), originalResult = result; + + if (!result.successful) { + if (pre is _Alt) errors.addAll(result.errors); + args.scanner.position = replay; + } else { + var left = result.value; + replay = args.scanner.position; + //print('${result.span.text}:\n' + scanner.emptySpan.highlight()); + + while (true) { + bool matched = false; + + //for (int i = 0; i < r.infix.length; i++) { + for (int i = r.infix.length - 1; i >= 0; i--) { + //var fix = r.infix.keys.elementAt(r.infix.length - i - 1); + var fix = reversedKeys.elementAt(i); + + if (i < precedence) continue; + + var result = fix._parse(args.increaseDepth()); + + if (!result.successful) { + if (fix is _Alt) errors.addAll(result.errors); + // If this is the last alternative and it failed, don't continue looping. + //if (true || i + 1 < r.infix.length) + args.scanner.position = replay; + } else { + //print('FOUND $fix when left was $left'); + //print('$i vs $precedence\n${originalResult.span.highlight()}'); + result = r.precedence(i)._parse(args.increaseDepth()); + + if (!result.successful) { + } else { + matched = false; + var old = left; + left = r.infix[fix](left, result.value, result); + print( + '$old $fix ${result.value} = $left\n${result.span.highlight()}'); + break; + } + } + } + + if (!matched) break; + } + + replay = args.scanner.position; + //print('f ${result.span.text}'); + + for (var post in r.postfix.keys) { + var result = pre._parse(args.increaseDepth()); + + if (!result.successful) { + if (post is _Alt) errors.addAll(result.errors); + args.scanner.position = replay; + } else { + left = r.infix[post](left, originalResult.value, result); + } + } + + if (!args.scanner.isDone) { + // If we're not done scanning, then we need some sort of guard to ensure the + // that this exact parser does not run again in the exact position. 
+ } + return ParseResult( + args.trampoline, + args.scanner, + this, + true, + errors, + value: left, + span: args.scanner.spanFrom(start), + ); + } + } + + return ParseResult( + args.trampoline, + args.scanner, + this, + false, + errors, + span: args.scanner.spanFrom(start), + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('precedence($precedence) (') + ..indent(); + r.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} +*/ diff --git a/common/combinator/lib/src/combinator/reduce.dart b/common/combinator/lib/src/combinator/reduce.dart new file mode 100644 index 0000000..c4db631 --- /dev/null +++ b/common/combinator/lib/src/combinator/reduce.dart @@ -0,0 +1,46 @@ +part of 'combinator.dart'; + +class _Reduce extends Parser { + final ListParser parser; + final T Function(T, T) combine; + + _Reduce(this.parser, this.combine); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + + if (!result.successful) { + return ParseResult( + args.trampoline, + args.scanner, + this, + false, + result.errors, + ); + } + + result = result.change( + value: result.value?.isNotEmpty == true ? result.value : []); + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + [], + span: result.span, + value: result.value!.isEmpty ? null : result.value!.reduce(combine), + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('reduce($combine) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/reference.dart b/common/combinator/lib/src/combinator/reference.dart new file mode 100644 index 0000000..20bc524 --- /dev/null +++ b/common/combinator/lib/src/combinator/reference.dart @@ -0,0 +1,44 @@ +part of 'combinator.dart'; + +Reference reference() => Reference._(); + +class Reference extends Parser { + Parser? 
_parser; + bool printed = false; + + Reference._(); + + set parser(Parser value) { + if (_parser != null) { + throw StateError('There is already a parser assigned to this reference.'); + } + _parser = value; + } + + @override + ParseResult __parse(ParseArgs args) { + if (_parser == null) { + throw StateError('There is no parser assigned to this reference.'); + } + return _parser!._parse(args); + } + + @override + ParseResult _parse(ParseArgs args) { + if (_parser == null) { + throw StateError('There is no parser assigned to this reference.'); + } + return _parser!._parse(args); + } + + @override + void stringify(CodeBuffer buffer) { + if (_parser == null) { + buffer.writeln('(undefined reference <$T>)'); + } else if (!printed) { + _parser!.stringify(buffer); + } + printed = true; + buffer.writeln('(previously printed reference)'); + } +} diff --git a/common/combinator/lib/src/combinator/repeat.dart b/common/combinator/lib/src/combinator/repeat.dart new file mode 100644 index 0000000..debf118 --- /dev/null +++ b/common/combinator/lib/src/combinator/repeat.dart @@ -0,0 +1,91 @@ +part of 'combinator.dart'; + +class _Repeat extends ListParser { + final Parser parser; + final int count; + final bool exact, backtrack; + final String tooFew; + final String tooMany; + final SyntaxErrorSeverity severity; + + _Repeat(this.parser, this.count, this.exact, this.tooFew, this.tooMany, + this.backtrack, this.severity); + + @override + ParseResult> __parse(ParseArgs args) { + var errors = []; + var results = []; + var spans = []; + var success = 0; + var replay = args.scanner.position; + ParseResult result; + + do { + result = parser._parse(args.increaseDepth()); + if (result.successful) { + success++; + if (result.value != null) { + results.add(result.value as T); + } + replay = args.scanner.position; + } else if (backtrack) { + args.scanner.position = replay; + } + + if (result.span != null) { + spans.add(result.span!); + } + } while (result.successful); + + if (success < count) { + errors.addAll(result.errors); + errors.add( + SyntaxError( + severity, + tooFew, + result.span ?? args.scanner.emptySpan, + ), + ); + + if (backtrack) args.scanner.position = replay; + + return ParseResult>( + args.trampoline, args.scanner, this, false, errors); + } else if (success > count && exact) { + if (backtrack) args.scanner.position = replay; + + return ParseResult>(args.trampoline, args.scanner, this, false, [ + SyntaxError( + severity, + tooMany, + result.span ?? 
args.scanner.emptySpan, + ), + ]); + } + + var span = spans.reduce((a, b) => a.expand(b)); + return ParseResult>( + args.trampoline, + args.scanner, + this, + true, + [], + span: span, + value: results, + ); + } + + @override + void stringify(CodeBuffer buffer) { + var r = StringBuffer('{$count'); + if (!exact) r.write(','); + r.write('}'); + buffer + ..writeln('repeat($r) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/safe.dart b/common/combinator/lib/src/combinator/safe.dart new file mode 100644 index 0000000..86c12fb --- /dev/null +++ b/common/combinator/lib/src/combinator/safe.dart @@ -0,0 +1,47 @@ +part of 'combinator.dart'; + +class _Safe extends Parser { + final Parser parser; + final bool backtrack; + final String errorMessage; + final SyntaxErrorSeverity severity; + bool _triggered = false; + + _Safe(this.parser, this.backtrack, this.errorMessage, this.severity); + + @override + ParseResult __parse(ParseArgs args) { + var replay = args.scanner.position; + + try { + if (_triggered) throw Exception(); + return parser._parse(args.increaseDepth()); + } catch (_) { + _triggered = true; + if (backtrack) args.scanner.position = replay; + var errors = []; + + errors.add( + SyntaxError( + severity, + errorMessage, + args.scanner.lastSpan ?? args.scanner.emptySpan, + ), + ); + + return ParseResult(args.trampoline, args.scanner, this, false, errors); + } + } + + @override + void stringify(CodeBuffer buffer) { + var t = _triggered ? 'triggered' : 'not triggered'; + buffer + ..writeln('safe($t) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/to_list.dart b/common/combinator/lib/src/combinator/to_list.dart new file mode 100644 index 0000000..f838ceb --- /dev/null +++ b/common/combinator/lib/src/combinator/to_list.dart @@ -0,0 +1,41 @@ +part of 'combinator.dart'; + +class _ToList extends ListParser { + final Parser parser; + + _ToList(this.parser); + + @override + ParseResult> __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()); + + if (result.value is List) { + return (result as ParseResult>).change(parser: this); + } + + var values = []; + if (result.value != null) { + values.add(result.value as T); + } + return ParseResult( + args.trampoline, + args.scanner, + this, + result.successful, + result.errors, + span: result.span, + value: values, + ); + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('to list (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/combinator/util.dart b/common/combinator/lib/src/combinator/util.dart new file mode 100644 index 0000000..69df180 --- /dev/null +++ b/common/combinator/lib/src/combinator/util.dart @@ -0,0 +1,57 @@ +part of 'combinator.dart'; + +/// A typed parser that parses a sequence of 2 values of different types. +Parser> tuple2(Parser a, Parser b) { + return chain([a, b]).map((r) { + return Tuple2(r.value?[0] as A, r.value?[1] as B); + }); +} + +/// A typed parser that parses a sequence of 3 values of different types. +Parser> tuple3(Parser a, Parser b, Parser c) { + return chain([a, b, c]).map((r) { + return Tuple3(r.value?[0] as A, r.value?[1] as B, r.value?[2] as C); + }); +} + +/// A typed parser that parses a sequence of 4 values of different types. 
+Parser> tuple4( + Parser a, Parser b, Parser c, Parser d) { + return chain([a, b, c, d]).map((r) { + return Tuple4( + r.value?[0] as A, r.value?[1] as B, r.value?[2] as C, r.value?[3] as D); + }); +} + +/// A typed parser that parses a sequence of 5 values of different types. +Parser> tuple5( + Parser a, Parser b, Parser c, Parser d, Parser e) { + return chain([a, b, c, d, e]).map((r) { + return Tuple5(r.value?[0] as A, r.value?[1] as B, r.value?[2] as C, + r.value?[3] as D, r.value?[4] as E); + }); +} + +/// A typed parser that parses a sequence of 6 values of different types. +Parser> tuple6(Parser a, + Parser b, Parser c, Parser d, Parser e, Parser f) { + return chain([a, b, c, d, e, f]).map((r) { + return Tuple6(r.value?[0] as A, r.value?[1] as B, r.value?[2] as C, + r.value?[3] as D, r.value?[4] as E, r.value?[5] as F); + }); +} + +/// A typed parser that parses a sequence of 7 values of different types. +Parser> tuple7( + Parser a, + Parser b, + Parser c, + Parser d, + Parser e, + Parser f, + Parser g) { + return chain([a, b, c, d, e, f, g]).map((r) { + return Tuple7(r.value?[0] as A, r.value?[1] as B, r.value?[2] as C, + r.value?[3] as D, r.value?[4] as E, r.value?[5] as F, r.value?[6] as G); + }); +} diff --git a/common/combinator/lib/src/combinator/value.dart b/common/combinator/lib/src/combinator/value.dart new file mode 100644 index 0000000..ec47cc7 --- /dev/null +++ b/common/combinator/lib/src/combinator/value.dart @@ -0,0 +1,25 @@ +part of 'combinator.dart'; + +class _Value extends Parser { + final Parser parser; + final T Function(ParseResult) f; + + _Value(this.parser, this.f); + + @override + ParseResult __parse(ParseArgs args) { + var result = parser._parse(args.increaseDepth()).change(parser: this); + return result.successful ? result.change(value: f(result)) : result; + } + + @override + void stringify(CodeBuffer buffer) { + buffer + ..writeln('set value($f) (') + ..indent(); + parser.stringify(buffer); + buffer + ..outdent() + ..writeln(')'); + } +} diff --git a/common/combinator/lib/src/error.dart b/common/combinator/lib/src/error.dart new file mode 100644 index 0000000..6c7b5f5 --- /dev/null +++ b/common/combinator/lib/src/error.dart @@ -0,0 +1,23 @@ +import 'package:source_span/source_span.dart'; + +class SyntaxError implements Exception { + final SyntaxErrorSeverity severity; + final String? message; + final FileSpan? span; + String? _toolString; + + SyntaxError(this.severity, this.message, this.span); + + String? get toolString { + if (_toolString != null) return _toolString; + var type = severity == SyntaxErrorSeverity.warning ? 
'warning' : 'error'; + return _toolString = '$type: ${span!.start.toolString}: $message'; + } +} + +enum SyntaxErrorSeverity { + warning, + error, + info, + hint, +} diff --git a/common/combinator/pubspec.lock b/common/combinator/pubspec.lock new file mode 100644 index 0000000..d5565ea --- /dev/null +++ b/common/combinator/pubspec.lock @@ -0,0 +1,425 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + charcode: + dependency: transitive + description: + name: charcode + sha256: fb0f1107cac15a5ea6ef0a6ef71a807b9e4267c713bb93e00e92d737cc8dbd8a + url: "https://pub.dev" + source: hosted + version: "1.4.0" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: 
"76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: "direct main" + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + platform_code_buffer: + dependency: "direct main" + description: + path: "../code_buffer" + relative: true + source: path + version: "5.2.0" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + 
version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: "direct main" + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: "direct main" + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + tuple: + dependency: "direct main" + description: + name: tuple + sha256: a97ce2013f240b2f3807bcbaf218765b6f301c3eff91092bcfa23a039e7dd151 + url: "https://pub.dev" + source: hosted + version: "2.0.2" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + 
url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/combinator/pubspec.yaml b/common/combinator/pubspec.yaml new file mode 100644 index 0000000..36b4176 --- /dev/null +++ b/common/combinator/pubspec.yaml @@ -0,0 +1,15 @@ +name: platform_combinator +version: 5.2.0 +description: Packrat parser combinators that support static typing, generics, file spans, memoization, and more. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/combinator +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + platform_code_buffer: ^5.0.0 + matcher: ^0.12.10 + source_span: ^1.8.1 + string_scanner: ^1.1.0 + tuple: ^2.0.0 +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 \ No newline at end of file diff --git a/common/combinator/test/all.dart b/common/combinator/test/all.dart new file mode 100644 index 0000000..9022d5f --- /dev/null +++ b/common/combinator/test/all.dart @@ -0,0 +1,12 @@ +import 'package:test/test.dart'; +import 'list_test.dart' as list; +import 'match_test.dart' as match; +import 'misc_test.dart' as misc; +import 'value_test.dart' as value; + +void main() { + group('list', list.main); + group('match', match.main); + group('value', value.main); + misc.main(); +} diff --git a/common/combinator/test/common.dart b/common/combinator/test/common.dart new file mode 100644 index 0000000..9ccc542 --- /dev/null +++ b/common/combinator/test/common.dart @@ -0,0 +1,3 @@ +import 'package:string_scanner/string_scanner.dart'; + +SpanScanner scan(String text) => SpanScanner(text); diff --git a/common/combinator/test/list_test.dart b/common/combinator/test/list_test.dart new file mode 100644 index 0000000..96348c9 --- /dev/null +++ b/common/combinator/test/list_test.dart @@ -0,0 +1,22 @@ +import 'package:platform_combinator/combinator.dart'; +import 'package:test/test.dart'; +import 'common.dart'; + +void main() { + var number = chain([ + match(RegExp(r'[0-9]+')).value((r) => int.parse(r.span!.text)), + match(',').opt(), + ]).first().cast(); + + var numbers = number.plus(); + + test('sort', () { + var parser = numbers.sort((a, b) => a.compareTo(b)); + expect(parser.parse(scan('21,2,3,34,20')).value, [2, 3, 20, 21, 34]); + }); + test('reduce', () { + var parser = numbers.reduce((a, b) => a + b); + expect(parser.parse(scan('21,2,3,34,20')).value, 80); + expect(parser.parse(scan('not numbers')).value, isNull); + }); +} diff --git a/common/combinator/test/match_test.dart b/common/combinator/test/match_test.dart new file mode 100644 index 0000000..c588e9c --- /dev/null +++ b/common/combinator/test/match_test.dart @@ -0,0 +1,16 @@ +import 'package:platform_combinator/combinator.dart'; +import 'package:test/test.dart'; +import 'common.dart'; + +void main() { + test('match string', () { + expect(match('hello').parse(scan('hello world')).successful, isTrue); + }); + 
test('match start only', () { + expect(match('hello').parse(scan('goodbye hello')).successful, isFalse); + }); + + test('fail if no match', () { + expect(match('hello').parse(scan('world')).successful, isFalse); + }); +} diff --git a/common/combinator/test/misc_test.dart b/common/combinator/test/misc_test.dart new file mode 100644 index 0000000..e6348b9 --- /dev/null +++ b/common/combinator/test/misc_test.dart @@ -0,0 +1,65 @@ +import 'package:platform_combinator/combinator.dart'; +import 'package:test/test.dart'; +import 'common.dart'; + +void main() { + test('advance', () { + var scanner = scan('hello world'); + + // Casted -> dynamic just for the sake of coverage. + var parser = match('he').forward(2).castDynamic(); + parser.parse(scanner); + expect(scanner.position, 4); + }); + + test('change', () { + var parser = match('hello').change((r) => r.change(value: 23)); + expect(parser.parse(scan('helloworld')).value, 23); + }); + + test('check', () { + var parser = match(RegExp(r'[A-Za-z]+')) + .value((r) => r.span!.length) + .check(greaterThan(3)); + expect(parser.parse(scan('helloworld')).successful, isTrue); + expect(parser.parse(scan('yo')).successful, isFalse); + }); + + test('map', () { + var parser = match(RegExp(r'[A-Za-z]+')).map((r) => r.span!.length); + expect(parser.parse(scan('hello')).value, 5); + }); + + test('negate', () { + var parser = match('hello').negate(errorMessage: 'world'); + expect(parser.parse(scan('goodbye world')).successful, isTrue); + expect(parser.parse(scan('hello world')).successful, isFalse); + expect(parser.parse(scan('hello world')).errors.first.message, 'world'); + }); + + group('opt', () { + var single = match('hello').opt(backtrack: true); + var list = match('hel').then(match('lo')).opt(); + + test('succeeds if present', () { + expect(single.parse(scan('hello')).successful, isTrue); + expect(list.parse(scan('hello')).successful, isTrue); + }); + + test('succeeds if not present', () { + expect(single.parse(scan('goodbye')).successful, isTrue); + expect(list.parse(scan('goodbye')).successful, isTrue); + }); + + test('backtracks if not present', () { + for (var parser in [single, list]) { + var scanner = scan('goodbye'); + var pos = scanner.position; + parser.parse(scanner); + expect(scanner.position, pos); + } + }); + }); + + test('safe', () {}); +} diff --git a/common/combinator/test/recursion_test.dart b/common/combinator/test/recursion_test.dart new file mode 100644 index 0000000..d91cebb --- /dev/null +++ b/common/combinator/test/recursion_test.dart @@ -0,0 +1,53 @@ +void main() {} + +/* +void main() { + var number = match( RegExp(r'-?[0-9]+(\.[0-9]+)?')) + .map((r) => num.parse(r.span.text)); + + var term = reference(); + + var r = Recursion(); + + r.prefix = [number]; + + r.infix.addAll({ + match('*'): (l, r, _) => l * r, + match('/'): (l, r, _) => l / r, + match('+'): (l, r, _) => l + r, + match('-'): (l, r, _) => l - r, + + + match('-'): (l, r, _) => l - r, + match('+'): (l, r, _) => l + r, + match('/'): (l, r, _) => l / r, + match('*'): (l, r, _) => l * r, + + }); + + term.parser = r.precedence(0); + + num parse(String text) { + var scanner = SpanScanner(text); + var result = term.parse(scanner); + print(result.span.highlight()); + return result.value; + } + + test('prefix', () { + expect(parse('24'), 24); + }); + + test('infix', () { + expect(parse('12/6'), 2); + expect(parse('24+23'), 47); + expect(parse('24-23'), 1); + expect(parse('4*3'), 12); + }); + + test('precedence', () { + expect(parse('2+3*5*2'), 15); + //expect(parse('2+3+5-2*2'), 
15); + }); +} +*/ diff --git a/common/combinator/test/value_test.dart b/common/combinator/test/value_test.dart new file mode 100644 index 0000000..d877c48 --- /dev/null +++ b/common/combinator/test/value_test.dart @@ -0,0 +1,15 @@ +import 'package:platform_combinator/combinator.dart'; +import 'package:test/test.dart'; +import 'common.dart'; + +void main() { + var parser = match('hello').value((r) => 'world'); + + test('sets value', () { + expect(parser.parse(scan('hello world')).value, 'world'); + }); + + test('no value if no match', () { + expect(parser.parse(scan('goodbye world')).value, isNull); + }); +} diff --git a/common/html_builder/AUTHORS.md b/common/html_builder/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/html_builder/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/html_builder/CHANGELOG.md b/common/html_builder/CHANGELOG.md new file mode 100644 index 0000000..2fbd07b --- /dev/null +++ b/common/html_builder/CHANGELOG.md @@ -0,0 +1,66 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 + +## 4.0.0 + +* Require Dart >= 2.17 +* Added hashCode + +## 3.0.2 + +* Fixed license link + +## 3.0.1 + +* Update broken link in pubspec + +## 3.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Removed deprecated parameters +* Published as `platform_html_builder` package + +## 2.0.3 + +* Added an example +* Updated README + +## 2.0.2 + +* Run `dartfmt -w .` + +## 2.0.1 + +* Added pedantic dart rules + +## 2.0.0 + +* Migrated to work with Dart SDK 2.12.x NNBD + +## 1.0.4 + +* Added `rebuild`, `rebuildRecursive`, and `NodeBuilder`. + +## 1.0.3 + +* Dart 2 ready! + +## 1.0.2 + +* Changed `h` and the `Node` constructor to take `Iterable`s of children, instead of just `List`s. diff --git a/packages/route/LICENSE b/common/html_builder/LICENSE similarity index 100% rename from packages/route/LICENSE rename to common/html_builder/LICENSE diff --git a/common/html_builder/README.md b/common/html_builder/README.md new file mode 100644 index 0000000..bcadafd --- /dev/null +++ b/common/html_builder/README.md @@ -0,0 +1,113 @@ +# Betaluk Html Builder + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_html_builder?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/html_builder/LICENSE) + +**Replacement of `package:html_builder` with breaking changes to support NNBD.** + +This package builds HTML AST's and renders them to HTML. It can be used as an internal DSL, i.e. for a templating engine. 
+ +## Installation + +In your `pubspec.yaml`: + +```yaml +dependencies: + platform_html_builder: ^5.1.0 +``` + +## Usage + +```dart +import 'package:platform_html_builder/platform_html_builder.dart'; + +void main() { + // Akin to React.createElement(...); + var $el = h('my-element', p: {}, c: []); + + // Attributes can be plain Strings. + h('foo', p: { + 'bar': 'baz' + }); + + // Null attributes do not appear. + h('foo', p: { + 'does-not-appear': null + }); + + // If an attribute is a bool, then it will only appear if its value is true. + h('foo', p: { + 'appears': true, + 'does-not-appear': false + }); + + // Or, a String or Map. + h('foo', p: { + 'style': 'background-color: white; color: red;' + }); + + h('foo', p: { + 'style': { + 'background-color': 'white', + 'color': 'red' + } + }); + + // Or, a String or Iterable. + h('foo', p: { + 'class': 'a b' + }); + + h('foo', p: { + 'class': ['a', 'b'] + }); +} +``` + +Standard HTML5 elements: + +```dart +import 'package:platform_html_builder/elements.dart'; + +void main() { + var $dom = html(lang: 'en', c: [ + head(c: [ + title(c: [text('Hello, world!')]) + ]), + body(c: [ + h1(c: [text('Hello, world!')]), + p(c: [text('Ok')]) + ]) + ]); +} +``` + +Rendering to HTML: + +```dart +String html = StringRenderer().render($dom); +``` + +Example implementation with the [Angel3](https://pub.dev/packages/angel3_framework) backend framework, +which uses [dedicated html_builder package](https://github.com/dukefirehawk/angel/tree/html): + +```dart +import 'dart:io'; +import 'package:belatuk_framework/belatuk_framework.dart'; +import 'package:platform_html_builder/elements.dart'; + +configureViews(Angel app) async { + app.get('/foo/:id', (req, res) async { + var foo = await app.service('foo').read(req.params['id']); + return html(c: [ + head(c: [ + title(c: [text(foo.name)]) + ]), + body(c: [ + h1(c: [text(foo.name)]) + ]) + ]); + }); +} +``` diff --git a/common/html_builder/analysis_options.yaml b/common/html_builder/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/html_builder/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/html_builder/example/main.dart b/common/html_builder/example/main.dart new file mode 100644 index 0000000..0674032 --- /dev/null +++ b/common/html_builder/example/main.dart @@ -0,0 +1,15 @@ +import 'package:platform_html_builder/elements.dart'; + +void main() { + var dom = html(lang: 'en', c: [ + head(c: [ + title(c: [text('Hello, world!')]) + ]), + body(c: [ + h1(c: [text('Hello, world!')]), + p(c: [text('Ok')]) + ]) + ]); + + print(dom); +} diff --git a/common/html_builder/lib/elements.dart b/common/html_builder/lib/elements.dart new file mode 100644 index 0000000..e81e0ee --- /dev/null +++ b/common/html_builder/lib/elements.dart @@ -0,0 +1,1841 @@ +/// Helper functions to build common HTML5 elements. +library platform_html_builder.elements; + +import 'html_builder.dart'; +export 'html_builder.dart'; + +Map _apply(Iterable> props, + [Map? attrs]) { + var map = {}; + attrs?.forEach((k, attr) { + if (attr is String && attr.isNotEmpty == true) { + map[k] = attr; + } else if (attr is Iterable && attr.isNotEmpty == true) { + map[k] = attr.toList(); + } else if (attr != null) { + map[k] = attr; + } + }); + + for (var p in props) { + map.addAll(p); + } + + return map.cast(); +} + +Node text(String text) => TextNode(text); + +Node a( + {String? href, + String? rel, + String? target, + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const []}) => + h( + 'a', + _apply([ + p, + ], { + 'href': href, + 'rel': rel, + 'target': target, + 'id': id, + 'class': className, + 'style': style, + }), + [...c]); + +Node abbr( + {String? title, + String? id, + className, + style, + Map p = const {}, + Iterable c = const []}) => + h( + 'addr', + _apply([p], + {'title': title, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node address({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('address', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node area({ + String? alt, + Iterable? coordinates, + String? download, + String? href, + String? hreflang, + String? media, + String? nohref, + String? rel, + String? shape, + String? target, + String? type, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'area', + _apply([ + p + ], { + 'alt': alt, + 'coordinates': coordinates, + 'download': download, + 'href': href, + 'hreflang': hreflang, + 'media': media, + 'nohref': nohref, + 'rel': rel, + 'shape': shape, + 'target': target, + 'type': type, + 'id': id, + 'class': className, + 'style': style + })); + +Node article({ + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('article', _apply([p], {'class': className, 'style': style}), [...c]); + +Node aside({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('aside', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node audio({ + bool? autoplay, + bool? controls, + bool? loop, + bool? muted, + String? preload, + String? src, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'audio', + _apply([ + p + ], { + 'autoplay': autoplay, + 'controls': controls, + 'loop': loop, + 'muted': muted, + 'preload': preload, + 'src': src, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node b({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('b', _apply([p], {'id': id, 'class': className, 'style': style}), [...c]); + +Node base({ + String? href, + String? target, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'base', + _apply([ + p + ], { + 'href': href, + 'target': target, + 'id': id, + 'class': className, + 'style': style + })); + +Node bdi({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('bdi', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node bdo({ + String? dir, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'bdo', + _apply([p], {'dir': dir, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node blockquote({ + String? cite, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'blockquote', + _apply( + [p], {'cite': cite, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node body({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('body', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node br() => SelfClosingNode('br'); + +Node button({ + bool? autofocus, + bool? disabled, + form, + String? formaction, + String? formenctype, + String? formmethod, + bool? formnovalidate, + String? formtarget, + String? name, + String? type, + String? 
value, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'button', + _apply([ + p + ], { + 'autofocus': autofocus, + 'disabled': disabled, + 'form': form, + 'formaction': formaction, + 'formenctype': formenctype, + 'formmethod': formmethod, + 'formnovalidate': formnovalidate, + 'formtarget': formtarget, + 'name': name, + 'type': type, + 'value': value, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node canvas({ + num? height, + num? width, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'canvas', + _apply([ + p + ], { + 'height': height, + 'width': width, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node cite({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('cite', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node caption({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('caption', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node code({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('code', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node col({ + num? span, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'col', + _apply( + [p], {'span': span, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node colgroup({ + num? span, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'colgroup', + _apply( + [p], {'span': span, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node datalist({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('datalist', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node dd({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('dd', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node del({ + String? cite, + String? datetime, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'del', + _apply([ + p + ], { + 'cite': cite, + 'datetime': datetime, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node details({ + bool? open, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'details', + _apply( + [p], {'open': open, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node dfn({ + String? title, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'dfn', + _apply([p], + {'title': title, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node dialog({ + bool? open, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'dialog', + _apply( + [p], {'open': open, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node div({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('div', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node dl({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('dl', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node dt({ + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('dt', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node em({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('em', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node embed({ + num? height, + String? src, + String? type, + num? width, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'embed', + _apply([ + p + ], { + 'height': height, + 'src': src, + 'type': type, + 'width': width, + 'id': id, + 'class': className, + 'style': style + })); + +Node fieldset({ + bool? disabled, + String? form, + String? name, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'fieldset', + _apply([ + p + ], { + 'disabled': disabled, + 'form': form, + 'name': name, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node figcaption({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('figcaption', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node figure({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('figure', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node footer({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('footer', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node form({ + String? accept, + String? acceptCharset, + String? action, + bool? autocomplete, + String? enctype, + String? method, + String? name, + bool? novalidate, + String? target, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'form', + _apply([ + p + ], { + 'accept': accept, + 'accept-charset': acceptCharset, + 'action': action, + 'autocomplete': + autocomplete != null ? (autocomplete ? 'on' : 'off') : null, + 'enctype': enctype, + 'method': method, + 'name': name, + 'novalidate': novalidate, + 'target': target, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node h1({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h1', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node h2({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h2', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); +Node h3({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h3', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node h4({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h4', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node h5({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h5', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node h6({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('h6', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node head({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('head', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node header({ + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('header', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node hr() => SelfClosingNode('hr'); + +Node html({ + String? manifest, + String? xmlns, + String? lang, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'html', + _apply([ + p + ], { + 'manifest': manifest, + 'xmlns': xmlns, + 'lang': lang, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node i({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('i', _apply([p], {'id': id, 'class': className, 'style': style}), [...c]); + +Node iframe({ + num? height, + String? name, + sandbox, + String? src, + String? srcdoc, + num? width, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'iframe', + _apply([ + p + ], { + 'height': height, + 'name': name, + 'sandbox': sandbox, + 'src': src, + 'srcdoc': srcdoc, + 'width': width, + 'id': id, + 'class': className, + 'style': style + })); + +Node img({ + String? alt, + String? crossorigin, + num? height, + String? ismap, + String? longdesc, + sizes, + String? src, + String? srcset, + String? usemap, + num? width, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'img', + _apply([ + p + ], { + 'alt': alt, + 'crossorigin': crossorigin, + 'height': height, + 'ismap': ismap, + 'longdesc': longdesc, + 'sizes': sizes, + 'src': src, + 'srcset': srcset, + 'usemap': usemap, + 'width': width, + 'id': id, + 'class': className, + 'style': style + })); + +Node input({ + String? accept, + String? alt, + bool? autocomplete, + bool? autofocus, + bool? checked, + String? dirname, + bool? disabled, + String? form, + String? formaction, + String? formenctype, + String? method, + String? formnovalidate, + String? formtarget, + num? height, + String? list, + max, + num? maxlength, + min, + bool? multiple, + String? name, + String? pattern, + String? placeholder, + bool? readonly, + bool? required, + num? size, + String? src, + num? step, + String? type, + String? value, + num? width, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'input', + _apply([ + p + ], { + 'accept': accept, + 'alt': alt, + 'autocomplete': + autocomplete == null ? null : (autocomplete ? 'on' : 'off'), + 'autofocus': autofocus, + 'checked': checked, + 'dirname': dirname, + 'disabled': disabled, + 'form': form, + 'formaction': formaction, + 'formenctype': formenctype, + 'method': method, + 'formnovalidate': formnovalidate, + 'formtarget': formtarget, + 'height': height, + 'list': list, + 'max': max, + 'maxlength': maxlength, + 'min': min, + 'multiple': multiple, + 'name': name, + 'pattern': pattern, + 'placeholder': placeholder, + 'readonly': readonly, + 'required': required, + 'size': size, + 'src': src, + 'step': step, + 'type': type, + 'value': value, + 'width': width, + 'id': id, + 'class': className, + 'style': style + })); + +Node ins({ + String? cite, + String? datetime, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'ins', + _apply([ + p + ], { + 'cite': cite, + 'datetime': datetime, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node kbd({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('kbd', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node keygen({ + bool? autofocus, + String? 
challenge, + bool? disabled, + String? from, + String? keytype, + String? name, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'keygen', + _apply([ + p + ], { + 'autofocus': autofocus, + 'challenge': challenge, + 'disabled': disabled, + 'from': from, + 'keytype': keytype, + 'name': name, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node label({ + String? for_, + String? form, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'label', + _apply([ + p + ], { + 'for': for_, + 'form': form, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node legend({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('legend', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node li({ + num? value, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'li', + _apply([p], + {'value': value, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node link({ + String? crossorigin, + String? href, + String? hreflang, + String? media, + String? rel, + sizes, + String? target, + String? type, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'link', + _apply([ + p + ], { + 'crossorigin': crossorigin, + 'href': href, + 'hreflang': hreflang, + 'media': media, + 'rel': rel, + 'sizes': sizes, + 'target': target, + 'type': type, + 'id': id, + 'class': className, + 'style': style + })); + +Node main({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('main', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node map({ + String? name, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'map', + _apply( + [p], {'name': name, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node mark({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('mark', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node menu({ + String? label, + String? type, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'menu', + _apply([ + p + ], { + 'label': label, + 'type': type, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node menuitem({ + bool? checked, + command, + bool? default_, + bool? disabled, + String? icon, + String? label, + String? radiogroup, + String? type, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'menuitem', + _apply([ + p + ], { + 'checked': checked, + 'command': command, + 'default': default_, + 'disabled': disabled, + 'icon': icon, + 'label': label, + 'radiogroup': radiogroup, + 'type': type, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node meta({ + String? charset, + String? content, + String? httpEquiv, + String? name, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'meta', + _apply([ + p + ], { + 'charset': charset, + 'content': content, + 'http-equiv': httpEquiv, + 'name': name, + 'id': id, + 'class': className, + 'style': style + })); + +Node nav({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('nav', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node noscript({ + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('noscript', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node object({ + String? data, + String? form, + num? height, + String? name, + String? type, + String? usemap, + num? width, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'object', + _apply([ + p + ], { + 'data': data, + 'form': form, + 'height': height, + 'name': name, + 'type': type, + 'usemap': usemap, + 'width': width, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node ol({ + bool? reversed, + num? start, + String? type, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'ol', + _apply([ + p + ], { + 'reversed': reversed, + 'start': start, + 'type': type, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node optgroup({ + bool? disabled, + String? label, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'optgroup', + _apply([ + p + ], { + 'disabled': disabled, + 'label': label, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node option({ + bool? disabled, + String? label, + bool? selected, + String? value, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'option', + _apply([ + p + ], { + 'disabled': disabled, + 'label': label, + 'selected': selected, + 'value': value, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node output({ + String? for_, + String? form, + String? name, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'output', + _apply([ + p + ], { + 'for': for_, + 'form': form, + 'name': name, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node p({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('p', _apply([p], {'id': id, 'class': className, 'style': style}), [...c]); + +Node param({ + String? name, + value, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'param', + _apply([ + p + ], { + 'name': name, + 'value': value, + 'id': id, + 'class': className, + 'style': style + })); + +Node picture({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('picture', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node pre({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('pre', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node progress({ + num? max, + num? value, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'progress', + _apply([ + p + ], { + 'max': max, + 'value': value, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node q({ + String? cite, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'q', + _apply( + [p], {'cite': cite, 'id': id, 'class': className, 'style': style}), + [...c]); + +Node rp({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('rp', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node rt({ + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('rt', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node ruby({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('ruby', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node s({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('s', _apply([p], {'id': id, 'class': className, 'style': style}), [...c]); + +Node samp({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('samp', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node script({ + bool? async, + String? charset, + bool? defer, + String? src, + String? type, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'script', + _apply([ + p + ], { + 'async': async, + 'charset': charset, + 'defer': defer, + 'src': src, + 'type': type, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node section({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('section', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node select({ + bool? autofocus, + bool? disabled, + String? form, + bool? multiple, + bool? required, + num? size, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'select', + _apply([ + p + ], { + 'autofocus': autofocus, + 'disabled': disabled, + 'form': form, + 'multiple': multiple, + 'required': required, + 'size': size, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node small({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('small', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node source({ + String? src, + String? srcset, + String? media, + sizes, + String? type, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'source', + _apply([ + p + ], { + 'src': src, + 'srcset': srcset, + 'media': media, + 'sizes': sizes, + 'type': type, + 'id': id, + 'class': className, + 'style': style + })); + +Node span({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('span', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node strong({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('strong', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node style({ + String? media, + bool? scoped, + String? type, + String? id, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'style', + _apply([p], {'media': media, 'scoped': scoped, 'type': type, 'id': id}), + [...c]); + +Node sub({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('sub', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node summary({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('summary', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node sup({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('sup', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node table({ + bool? sortable, + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'table', + _apply([ + p + ], { + 'sortable': sortable, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node tbody({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('tbody', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node td({ + num? colspan, + headers, + num? rowspan, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'td', + _apply([ + p + ], { + 'colspan': colspan, + 'headers': headers, + 'rowspan': rowspan, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node textarea({ + bool? autofocus, + num? cols, + String? dirname, + bool? disabled, + String? form, + num? maxlength, + String? name, + String? placeholder, + bool? readonly, + bool? required, + num? rows, + String? wrap, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'textarea', + _apply([ + p + ], { + 'autofocus': autofocus, + 'cols': cols, + 'dirname': dirname, + 'disabled': disabled, + 'form': form, + 'maxlength': maxlength, + 'name': name, + 'placeholder': placeholder, + 'readonly': readonly, + 'required': required, + 'rows': rows, + 'wrap': wrap, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node tfoot({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('tfoot', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node th({ + String? abbr, + num? colspan, + headers, + num? rowspan, + String? scope, + sorted, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'th', + _apply([ + p + ], { + 'abbr': abbr, + 'colspan': colspan, + 'headers': headers, + 'rowspan': rowspan, + 'scope': scope, + 'sorted': sorted, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node thead({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('thead', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node time({ + String? datetime, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'time', + _apply([ + p + ], { + 'datetime': datetime, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node title({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('title', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node tr({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('tr', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node track({ + bool? default_, + String? kind, + String? label, + String? src, + String? srclang, + String? id, + className, + style, + Map p = const {}, +}) => + SelfClosingNode( + 'track', + _apply([ + p + ], { + 'default': default_, + 'kind': kind, + 'label': label, + 'src': src, + 'srclang': srclang, + 'id': id, + 'class': className, + 'style': style + })); + +Node u({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('u', _apply([p], {'id': id, 'class': className, 'style': style}), [...c]); + +Node ul({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('ul', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node var_({ + String? 
id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('var', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); + +Node video({ + bool? autoplay, + bool? controls, + num? height, + bool? loop, + bool? muted, + String? poster, + String? preload, + String? src, + num? width, + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h( + 'video', + _apply([ + p + ], { + 'autoplay': autoplay, + 'controls': controls, + 'height': height, + 'loop': loop, + 'muted': muted, + 'poster': poster, + 'preload': preload, + 'src': src, + 'width': width, + 'id': id, + 'class': className, + 'style': style + }), + [...c]); + +Node wbr({ + String? id, + className, + style, + Map p = const {}, + Iterable c = const [], +}) => + h('wbr', _apply([p], {'id': id, 'class': className, 'style': style}), + [...c]); diff --git a/common/html_builder/lib/html_builder.dart b/common/html_builder/lib/html_builder.dart new file mode 100644 index 0000000..ebf01f4 --- /dev/null +++ b/common/html_builder/lib/html_builder.dart @@ -0,0 +1,4 @@ +export 'src/mutations.dart'; +export 'src/node.dart'; +export 'src/node_builder.dart'; +export 'src/renderer.dart'; diff --git a/common/html_builder/lib/src/mutations.dart b/common/html_builder/lib/src/mutations.dart new file mode 100644 index 0000000..f3cdc7e --- /dev/null +++ b/common/html_builder/lib/src/mutations.dart @@ -0,0 +1,20 @@ +import 'node.dart'; +import 'node_builder.dart'; + +/// Returns a function that rebuilds an arbitrary [Node] by applying the [transform] to it. +Node Function(Node) rebuild(NodeBuilder Function(NodeBuilder) transform, + {bool selfClosing = false}) { + return (node) => + transform(NodeBuilder.from(node)).build(selfClosing: selfClosing); +} + +/// Applies [f] to all children of this node, recursively. +/// +/// Use this alongside [rebuild]. +Node Function(Node) rebuildRecursive(Node Function(Node) f) { + Node build(Node node) { + return NodeBuilder.from(f(node)).mapChildren(build).build(); + } + + return build; +} diff --git a/common/html_builder/lib/src/node.dart b/common/html_builder/lib/src/node.dart new file mode 100644 index 0000000..8e3cae4 --- /dev/null +++ b/common/html_builder/lib/src/node.dart @@ -0,0 +1,73 @@ +import 'package:collection/collection.dart'; + +/// Shorthand function to generate a new [Node]. +Node h(String tagName, + [Map attributes = const {}, + Iterable children = const []]) => + Node(tagName, attributes, children); + +/// Represents an HTML node. +class Node { + final String tagName; + final Map attributes = {}; + final List children = []; + + Node(this.tagName, + [Map attributes = const {}, + Iterable children = const []]) { + this + ..attributes.addAll(attributes) + ..children.addAll(children); + } + + Node._selfClosing(this.tagName, + [Map attributes = const {}]) { + this.attributes.addAll(attributes); + } + + @override + bool operator ==(other) { + return other is Node && + other.tagName == tagName && + const ListEquality().equals(other.children, children) && + const MapEquality() + .equals(other.attributes, attributes); + } + + @override + int get hashCode { + int hash = Object.hash(tagName, Object.hashAll(children)); + return Object.hash(hash, Object.hashAll(attributes.values)); + } +} + +/// Represents a self-closing tag, i.e. `
`. +class SelfClosingNode extends Node { + /* + @override + final String tagName; + + @override + final Map attributes = {}; + */ + + @override + List get children => List.unmodifiable([]); + + // ignore: use_super_parameters + SelfClosingNode(String tagName, [Map attributes = const {}]) + : super._selfClosing(tagName, attributes); +} + +/// Represents a text node. +class TextNode extends Node { + final String text; + + TextNode(this.text) : super(':text'); + + @override + bool operator ==(other) => other is TextNode && other.text == text; + + @override + int get hashCode => text.hashCode; +} diff --git a/common/html_builder/lib/src/node_builder.dart b/common/html_builder/lib/src/node_builder.dart new file mode 100644 index 0000000..ab566c8 --- /dev/null +++ b/common/html_builder/lib/src/node_builder.dart @@ -0,0 +1,108 @@ +import 'node.dart'; + +/// Helper class to build nodes. +class NodeBuilder { + final String tagName; + final Map attributes; + final Iterable children; + Node? _existing; + + NodeBuilder(this.tagName, + {this.attributes = const {}, this.children = const []}); + + /// Creates a [NodeBuilder] that just spits out an already-existing [Node]. + factory NodeBuilder.existing(Node existingNode) => + NodeBuilder(existingNode.tagName).._existing = existingNode; + + factory NodeBuilder.from(Node node) => NodeBuilder(node.tagName, + attributes: Map.from(node.attributes), + children: List.from(node.children)); + + /// Builds the node. + Node build({bool selfClosing = false}) => + _existing ?? + (selfClosing + ? SelfClosingNode(tagName, attributes) + : Node(tagName, attributes, children)); + + /// Produce a modified copy of this builder. + NodeBuilder change( + {String? tagName, + Map? attributes, + Iterable? children}) { + return NodeBuilder(tagName ?? this.tagName, + attributes: attributes ?? this.attributes, + children: children ?? 
this.children); + } + + NodeBuilder changeTagName(String tagName) => change(tagName: tagName); + + NodeBuilder changeAttributes(Map attributes) => + change(attributes: attributes); + + NodeBuilder changeChildren(Iterable children) => + change(children: children); + + NodeBuilder changeAttributesMapped( + Map Function(Map) f) { + var map = Map.from(attributes); + return changeAttributes(f(map)); + } + + NodeBuilder changeChildrenMapped(Iterable Function(List) f) { + var list = List.from(children); + return changeChildren(f(list)); + } + + NodeBuilder mapChildren(Node Function(Node) f) => + changeChildrenMapped((list) => list.map(f)); + + NodeBuilder mapAttributes( + MapEntry Function(String, dynamic) f) => + changeAttributesMapped((map) => map.map(f)); + + NodeBuilder setAttribute(String name, dynamic value) => + changeAttributesMapped((map) => map..[name] = value); + + NodeBuilder addChild(Node child) => + changeChildrenMapped((list) => list..add(child)); + + NodeBuilder removeChild(Node child) => + changeChildrenMapped((list) => list..remove(child)); + + NodeBuilder removeAttribute(String name) => + changeAttributesMapped((map) => map..remove(name)); + + NodeBuilder setId(String id) => setAttribute('id', id); + + NodeBuilder setClassName(String className) => + setAttribute('class', className); + + NodeBuilder setClasses(Iterable classes) => + setClassName(classes.join(' ')); + + NodeBuilder setClassesMapped(Iterable Function(List) f) { + var clazz = attributes['class']; + var classes = []; + + if (clazz is String) { + classes.addAll(clazz.split(' ')); + } else if (clazz is Iterable) { + classes.addAll(clazz.map((s) => s.toString())); + } + + return setClasses(f(classes)); + } + + NodeBuilder addClass(String className) => setClassesMapped( + (classes) => classes.contains(className) ? classes : classes + ..add(className)); + + NodeBuilder removeClass(String className) => + setClassesMapped((classes) => classes..remove(className)); + + NodeBuilder toggleClass(String className) => + setClassesMapped((classes) => classes.contains(className) + ? (classes..remove(className)) + : (classes..add(className))); +} diff --git a/common/html_builder/lib/src/renderer.dart b/common/html_builder/lib/src/renderer.dart new file mode 100644 index 0000000..420165f --- /dev/null +++ b/common/html_builder/lib/src/renderer.dart @@ -0,0 +1,140 @@ +import 'node.dart'; + +/// An object that can render a DOM tree into another representation, i.e. a `String`. +abstract class Renderer { + /// Renders a DOM tree into another representation. + T render(Node rootNode); +} + +/// Renders a DOM tree into a HTML string. +abstract class StringRenderer implements Renderer { + /// Initializes a new [StringRenderer]. + /// + /// If [html5] is not `false` (default: `true`), then self-closing elements will be rendered with a slash before the last angle bracket, ex. `
`. + /// If [pretty] is `true` (default), then [whitespace] (default: `' '`) will be inserted between nodes. + /// You can also provide a [doctype] (default: `html`). + factory StringRenderer( + {bool html5 = true, + bool pretty = true, + String doctype = 'html', + String whitespace = ' '}) => + pretty == true + ? _PrettyStringRendererImpl( + html5: html5 != false, doctype: doctype, whitespace: whitespace) + : _StringRendererImpl(html5: html5 != false, doctype: doctype); +} + +class _StringRendererImpl implements StringRenderer { + final String? doctype; + final bool? html5; + + _StringRendererImpl({this.html5, this.doctype}); + + void _renderInto(Node node, StringBuffer buf) { + if (node is TextNode) { + buf.write(node.text); + } else { + buf.write('<${node.tagName}'); + + node.attributes.forEach((k, v) { + if (v == true) { + buf.write(' $k'); + } else if (v == false || v == null) { + // Ignore + } else if (v is Iterable) { + var val = v.join(' ').replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } else if (v is Map) { + var val = v.keys + .fold('', (out, k) => out += '$k: ${v[k]};') + .replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } else { + var val = v.toString().replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } + }); + + if (node is SelfClosingNode) { + buf.write((html5 != false) ? '>' : '/>'); + } else { + buf.write('>'); + for (var child in node.children) { + _renderInto(child, buf); + } + buf.write(''); + } + } + } + + @override + String render(Node rootNode) { + var buf = StringBuffer(); + if (doctype?.isNotEmpty == true) buf.write(''); + _renderInto(rootNode, buf); + return buf.toString(); + } +} + +class _PrettyStringRendererImpl implements StringRenderer { + final bool? html5; + final String? doctype, whitespace; + + _PrettyStringRendererImpl({this.html5, this.whitespace, this.doctype}); + + void _applyTabs(int tabs, StringBuffer buf) { + for (var i = 0; i < tabs; i++) { + buf.write(whitespace ?? ' '); + } + } + + void _renderInto(int tabs, Node node, StringBuffer buf) { + if (tabs > 0) buf.writeln(); + _applyTabs(tabs, buf); + + if (node is TextNode) { + buf.write(node.text); + } else { + buf.write('<${node.tagName}'); + + node.attributes.forEach((k, v) { + if (v == true) { + buf.write(' $k'); + } else if (v == false || v == null) { + // Ignore + } else if (v is Iterable) { + var val = v.join(' ').replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } else if (v is Map) { + var val = v.keys + .fold('', (out, k) => out += '$k: ${v[k]};') + .replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } else { + var val = v.toString().replaceAll('"', '\\"'); + buf.write(' $k="$val"'); + } + }); + + if (node is SelfClosingNode) { + buf.write((html5 != false) ? 
'>' : '/>'); + } else { + buf.write('>'); + for (var child in node.children) { + _renderInto(tabs + 1, child, buf); + } + buf.writeln(); + _applyTabs(tabs, buf); + buf.write(''); + } + } + } + + @override + String render(Node rootNode) { + var buf = StringBuffer(); + if (doctype?.isNotEmpty == true) buf.writeln(''); + _renderInto(0, rootNode, buf); + return buf.toString(); + } +} diff --git a/common/html_builder/pubspec.lock b/common/html_builder/pubspec.lock new file mode 100644 index 0000000..51c766e --- /dev/null +++ b/common/html_builder/pubspec.lock @@ -0,0 +1,418 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: "direct main" + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + csslib: + dependency: transitive + description: + name: csslib + sha256: "09bad715f418841f976c77db72d5398dc1253c21fb9c0c7f0b0b985860b2d58e" + url: "https://pub.dev" + source: hosted + version: "1.0.2" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + html: + dependency: "direct dev" + description: + name: html + sha256: 
"1fc58edeaec4307368c60d59b7e15b9d658b57d7f3125098b6294153c75337ec" + url: "https://pub.dev" + source: hosted + version: "0.15.5" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: 
"89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: 
transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/html_builder/pubspec.yaml b/common/html_builder/pubspec.yaml new file mode 100644 index 0000000..a28effb --- /dev/null +++ b/common/html_builder/pubspec.yaml @@ -0,0 +1,12 @@ +name: platform_html_builder +version: 5.2.0 +description: Build HTML AST's and render them to HTML. This can be used as an internal DSL, i.e. for a templating engine. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/html_builder +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + collection: ^1.17.0 +dev_dependencies: + html: ^0.15.0 + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/html_builder/test/render_test.dart b/common/html_builder/test/render_test.dart new file mode 100644 index 0000000..47a8915 --- /dev/null +++ b/common/html_builder/test/render_test.dart @@ -0,0 +1,33 @@ +import 'package:html/parser.dart' as html5; +import 'package:platform_html_builder/elements.dart'; +import 'package:test/test.dart'; + +void main() { + test('pretty', () { + var $dom = html( + lang: 'en', + c: [ + head(c: [ + title(c: [text('Hello, world!')]) + ]), + body( + p: {'unresolved': true}, + c: [ + h1(c: [text('Hello, world!')]), + br(), + hr(), + ], + ) + ], + ); + + var rendered = StringRenderer().render($dom); + print(rendered); + + var $parsed = html5.parse(rendered); + var $title = $parsed.querySelector('title')!; + expect($title.text.trim(), 'Hello, world!'); + var $h1 = $parsed.querySelector('h1')!; + expect($h1.text.trim(), 'Hello, world!'); + }); +} diff --git a/common/http_server/.gitignore b/common/http_server/.gitignore new file mode 100644 index 0000000..0a77d85 --- /dev/null +++ b/common/http_server/.gitignore @@ -0,0 +1,5 @@ +.dart_tool +.packages +.metals +.vscode +pubspec.lock diff --git a/common/http_server/.test_config b/common/http_server/.test_config new file mode 100644 index 0000000..2535563 --- /dev/null +++ b/common/http_server/.test_config @@ -0,0 +1,3 @@ +{ + "test_package": true +} diff --git a/common/http_server/AUTHORS.md b/common/http_server/AUTHORS.md new file mode 100644 index 0000000..0fc07f3 --- /dev/null +++ b/common/http_server/AUTHORS.md @@ -0,0 +1,9 @@ +# Authors + +## Current + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + +## Previous + +* Google Inc. 
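The `platform_html_builder` package added above ships the element DSL (`elements.dart`), the `Node`/`NodeBuilder` classes, the `rebuild` helpers, and `StringRenderer`, but the only usage shown is the render test. The sketch below is written against the APIs visible in this diff: it composes a small tree with the element helpers and then rewrites it with `rebuild` and `NodeBuilder`. The class names and the `data-annotated` attribute are purely illustrative.

```dart
import 'package:platform_html_builder/elements.dart';
import 'package:platform_html_builder/html_builder.dart';

void main() {
  // Compose a small tree with the element helpers from elements.dart.
  var card = section(
    className: 'card',
    c: [
      h1(c: [text('Greetings')]),
      p(c: [text('Hello from html_builder')]),
    ],
  );

  // rebuild() wraps a Node in a NodeBuilder, applies the transform, and
  // builds a fresh Node. Here it adds a class and an illustrative boolean
  // attribute (boolean attributes are rendered bare by StringRenderer).
  var annotated = rebuild((b) =>
      b.addClass('highlight').setAttribute('data-annotated', true))(card);

  // Render compactly; an empty doctype suppresses the DOCTYPE line.
  print(StringRenderer(pretty: false, doctype: '').render(annotated));
}
```

Note that `TextNode` content is written verbatim by both renderer implementations in `renderer.dart`, so any HTML escaping of text is the caller's responsibility.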
diff --git a/common/http_server/CHANGELOG.md b/common/http_server/CHANGELOG.md new file mode 100644 index 0000000..8277cea --- /dev/null +++ b/common/http_server/CHANGELOG.md @@ -0,0 +1,112 @@ +# Change Log + +## 4.5.1 + +* Fixed linter warnings + +## 4.5.0 + +* Require Dart >= 3.5 +* Updated `lints` to 5.0.0 +* Updated `mime` to 2.0.0 + +## 4.4.0 + +* Updated `lints` to 4.0.0 + +## 4.3.0 + +* Require Dart >= 3.3 + +## 4.2.0 + +* Upgraded `lints` to 3.0.0 + +## 4.1.2 + +* Fixed `_VirtualDirectoryFileStream` class due to change in Dart 3.0.3 + +## 4.1.1 + +* Updated README + +## 4.1.0 + +* Upgraded `test_api` to 0.6.0 + +## 4.0.0 + +* Require Dart >= 3.0 + +## 4.0.0-beta.1 + +* Require Dart >= 3.0 +* Fixed linter warnings + +## 3.0.0 + +* Require Dart >= 2.17 +* Fixed analyzer warnings + +## 2.1.0 + +* Updated linter to `package:lints` + +## 2.0.2 + +* Transfered repository to `dart-backend` + +## 2.0.1 + +* Added example + +## 2.0.0 + +* Migrated to `platform_http_server` + +## 1.0.0 + +* Migrate to null safety. +* Allow multipart form data with specified encodings that don't require + decoding. + +## 0.9.8+3 + +* Prepare for `HttpClientResponse` SDK change (implements `Stream` + rather than `Stream>`). + +## 0.9.8+2 + +* Prepare for `File.openRead()` SDK change in signature. + +## 0.9.8+1 + +* Fix a Dart 2 type issue. + +## 0.9.8 + +* Updates to support Dart 2 constants. + +## 0.9.7 + +* Updates to support Dart 2.0 core library changes (wave + 2.2). See [issue 31847][sdk#31847] for details. + + [sdk#31847]: https://github.com/dart-lang/sdk/issues/31847 + +## 0.9.6 + +* Updated the secure networking code to the SDKs version 1.15 SecurityContext api + +## 0.9.5+1 + +* Updated the layout of package contents. + +## 0.9.5 + +* Removed the decoding of HTML entity values (in the form &#xxxxx;) for + values when parsing multipart/form-post requests. + +## 0.9.4 + +* Fixed bugs in the handling of the Range header diff --git a/common/http_server/CONTRIBUTING.md b/common/http_server/CONTRIBUTING.md new file mode 100644 index 0000000..23c48ac --- /dev/null +++ b/common/http_server/CONTRIBUTING.md @@ -0,0 +1,3 @@ +# Contributing + +Any contributions from the community are welcome. diff --git a/common/http_server/LICENSE b/common/http_server/LICENSE new file mode 100644 index 0000000..ce4a4be --- /dev/null +++ b/common/http_server/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dart-backend +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/common/http_server/LICENSE.orig b/common/http_server/LICENSE.orig new file mode 100644 index 0000000..633672a --- /dev/null +++ b/common/http_server/LICENSE.orig @@ -0,0 +1,27 @@ +Copyright 2015, the Dart project authors. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are +met: + + * Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above + copyright notice, this list of conditions and the following + disclaimer in the documentation and/or other materials provided + with the distribution. + * Neither the name of Google LLC nor the names of its + contributors may be used to endorse or promote products derived + from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS +"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT +LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR +A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT +OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, +SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT +LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, +DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY +THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/common/http_server/README.md b/common/http_server/README.md new file mode 100644 index 0000000..d656882 --- /dev/null +++ b/common/http_server/README.md @@ -0,0 +1,11 @@ +# Utility classes for HTTP server + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_http_server?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/platform_http_server)](https://github.com/dart-backend/platform_http_server/blob/master/LICENSE) + +**Forked from the archived `http_server` project** + +A set of high-level classes that, together with `HttpServer`, makes is easier to serve web content. + +**NOTE:** This package only works for server-side or command-line Dart applications. In other words, if the app imports `dart:io`, it can use this package. 
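The README above is deliberately brief; the package's entry points are `HttpBodyHandler`, `VirtualDirectory`, and `VirtualHost`, defined in the files that follow. As a minimal sketch of the body-parsing side (the library docs below demonstrate static serving but not multipart handling end to end), here is a small server built only on the APIs added in this diff; the port and the way fields are printed are illustrative.

```dart
import 'dart:io';

import 'package:platform_http_server/http_server.dart';

Future<void> main() async {
  var server = await HttpServer.bind(InternetAddress.loopbackIPv4, 8080);

  // HttpBodyHandler transforms a stream of HttpRequest into HttpRequestBody,
  // so the server can be consumed directly with `await for`.
  await for (var body in server.transform(HttpBodyHandler())) {
    if (body.type == 'form') {
      // For multipart/form-data and x-www-form-urlencoded requests, body.body
      // is a map; file upload fields arrive as HttpBodyFileUpload instances.
      (body.body as Map).forEach((name, value) {
        if (value is HttpBodyFileUpload) {
          print('$name: ${value.filename} (${value.contentType})');
        } else {
          print('$name: $value');
        }
      });
    }
    body.request.response
      ..statusCode = HttpStatus.ok
      ..write('received a ${body.type} body');
    await body.request.response.close();
  }
}
```

For a single route, `HttpBodyHandler.processRequest(request)` can be called per request instead of transforming the whole server stream, as the doc comments in `http_body.dart` below describe.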
diff --git a/packages/route/analysis_options.yaml b/common/http_server/analysis_options.yaml similarity index 100% rename from packages/route/analysis_options.yaml rename to common/http_server/analysis_options.yaml diff --git a/common/http_server/example/main.dart b/common/http_server/example/main.dart new file mode 100644 index 0000000..315088c --- /dev/null +++ b/common/http_server/example/main.dart @@ -0,0 +1,25 @@ +import 'dart:convert'; +import 'dart:io'; + +import 'package:platform_http_server/http_server.dart'; + +void main() async { + var server = await HttpServer.bind('localhost', 8080); + server.transform(HttpBodyHandler(defaultEncoding: utf8)).listen((body) { + switch (body.type) { + case 'text': + print(body.body); + break; + + case 'json': + print(body.body); + break; + + default: + throw StateError('bad body type'); + } + body.request.response.close(); + }, onError: (Object error) { + throw StateError('bad connection'); + }); +} diff --git a/common/http_server/lib/http_server.dart b/common/http_server/lib/http_server.dart new file mode 100644 index 0000000..5a4e7d1 --- /dev/null +++ b/common/http_server/lib/http_server.dart @@ -0,0 +1,64 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +/// Utilities for working with `HttpServer`. +/// +/// ## Example +/// +/// Serving all files from the current directory. +/// +/// ```dart +/// import 'dart:io'; +/// +/// import 'package:http_server/http_server.dart'; +/// +/// Future serveCurrentDirectory() async { +/// var staticFiles = VirtualDirectory('.')..allowDirectoryListing = true; +/// +/// var server = await HttpServer.bind('0.0.0.0', 7777); +/// print('Server running'); +/// server.listen(staticFiles.serveRequest); +/// } +/// ``` +/// +/// ## Virtual directory +/// +/// The [VirtualDirectory] class makes it easy to serve static content +/// from the file system. It supports: +/// +/// * Range-based requests. +/// * If-Modified-Since based caching. +/// * Automatic GZip-compression of content. +/// * Following symlinks, either throughout the system or inside +/// a jailed root. +/// * Directory listing. +/// +/// See [VirtualDirectory] for more information. +/// +/// ## Virtual host +/// +/// The [VirtualHost] class helps to serve multiple hosts on the same +/// address, by using the `Host` field of the incoming requests. It also +/// works with wildcards for sub-domains. +/// +/// ```dart +/// var virtualHost = new VirtualHost(server); +/// // Filter out on a specific host +/// var stream1 = virtualServer.addHost('static.myserver.com'); +/// // Wildcard for any other sub-domains. +/// var stream2 = virtualServer.addHost('*.myserver.com'); +/// // Requests not matching any hosts. +/// var stream3 = virtualServer.unhandled; +/// ``` +/// +/// See [VirtualHost] for more information. +library http_server; + +import 'src/virtual_directory.dart'; +import 'src/virtual_host.dart'; + +export 'src/http_body.dart'; +export 'src/http_multipart_form_data.dart'; +export 'src/virtual_directory.dart'; +export 'src/virtual_host.dart'; diff --git a/common/http_server/lib/src/has_current_iterator.dart b/common/http_server/lib/src/has_current_iterator.dart new file mode 100644 index 0000000..07ed029 --- /dev/null +++ b/common/http_server/lib/src/has_current_iterator.dart @@ -0,0 +1,27 @@ +// Copyright (c) 2021, the Dart project authors. 
Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +/// [Iterator] which wraps another [Iterator] and allows you to ask if there +/// is a valid value in [current]. +class HasCurrentIterator implements Iterator { + final Iterator _iterator; + + /// The result of the last call to [moveNext]. + bool _hasCurrent = false; + + /// Whether or not `current` has a valid value. + /// + /// This starts out as `false`, and then stores the value of the previous + /// `moveNext` call. + bool get hasCurrent => _hasCurrent; + + HasCurrentIterator(this._iterator); + + /// Must be called before reading [current] or [hasCurrent]. + @override + bool moveNext() => _hasCurrent = _iterator.moveNext(); + + @override + E get current => _iterator.current; +} diff --git a/common/http_server/lib/src/http_body.dart b/common/http_server/lib/src/http_body.dart new file mode 100644 index 0000000..27d4c28 --- /dev/null +++ b/common/http_server/lib/src/http_body.dart @@ -0,0 +1,316 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:convert'; +import 'dart:io' hide BytesBuilder; +import 'dart:typed_data'; + +import 'package:mime/mime.dart'; + +import 'http_multipart_form_data.dart'; + +/// A handler for processing and collecting HTTP message data in to an +/// [HttpBody]. +/// +/// The content body is parsed, depending on the `Content-Type` header field. +/// When the full body is read and parsed the body content is made available. +/// The class can be used to process both server requests and client responses. +/// +/// The following content types are recognized: +/// +/// - text/* +/// - application/json +/// - application/x-www-form-urlencoded +/// - multipart/form-data +/// +/// For content type `text/*` the body is decoded into a string. The +/// 'charset' parameter of the content type specifies the encoding +/// used for decoding. If no 'charset' is present the default encoding +/// of ISO-8859-1 is used. +/// +/// For content type `application/json` the body is decoded into a +/// string which is then parsed as JSON. The resulting body is a +/// [Map]. The 'charset' parameter of the content type specifies the +/// encoding used for decoding. If no 'charset' is present the default +/// encoding of UTF-8 is used. +/// +/// For content type `application/x-www-form-urlencoded` the body is a +/// query string which is then split according to the rules for +/// splitting a query string. The resulting body is a `Map`. If the same name is present several times in the query +/// string, then the last value seen for this name will be in the +/// resulting map. The encoding US-ASCII is always used for decoding +/// the body. +/// +/// For content type `multipart/form-data` the body is parsed into +/// it's different fields. The resulting body is a `Map`, where the value is a [String] for normal fields and a +/// [HttpBodyFileUpload] instance for file upload fields. If the same +/// name is present several times, then the last value seen for this +/// name will be in the resulting map. +/// +/// When using content type `multipart/form-data` the encoding of +/// fields with [String] values is determined by the browser sending +/// the HTTP request with the form data. 
The encoding is specified +/// either by the attribute `accept-charset` on the HTML form, or by +/// the content type of the web page containing the form. If the HTML +/// form has an `accept-charset` attribute the browser will use the +/// encoding specified there. If the HTML form has no `accept-charset` +/// attribute the browser determines the encoding from the content +/// type of the web page containing the form. Using a content type of +/// `text/html; charset=utf-8` for the page and setting +/// `accept-charset` on the HTML form to `utf-8` is recommended as the +/// default for [HttpBodyHandler] is UTF-8. It is important to get +/// these encoding values right, as the actual `multipart/form-data` +/// HTTP request sent by the browser does _not_ contain any +/// information on the encoding. If something else than UTF-8 is used +/// `defaultEncoding` needs to be set in the [HttpBodyHandler] +/// constructor and calls to [processRequest] and [processResponse]. +/// +/// For all other content types the body will be treated as +/// uninterpreted binary data. The resulting body will be of type +/// `List`. +/// +/// To use with the [HttpServer] for request messages, [HttpBodyHandler] can be +/// used as either a [StreamTransformer] or as a per-request handler (see +/// [processRequest]). +/// +/// ```dart +/// HttpServer server = ... +/// server.transform(HttpBodyHandler()) +/// .listen((HttpRequestBody body) { +/// ... +/// }); +/// ``` +/// +/// To use with the [HttpClient] for response messages, [HttpBodyHandler] can be +/// used as a per-request handler (see [processResponse]). +/// +/// ```dart +/// HttpClient client = ... +/// var request = await client.get(...); +/// var response = await request.close(); +/// var body = HttpBodyHandler.processResponse(response); +/// ``` +class HttpBodyHandler + extends StreamTransformerBase { + final Encoding _defaultEncoding; + + /// Create a new [HttpBodyHandler] to be used with a [Stream]<[HttpRequest]>, + /// e.g. a [HttpServer]. + /// + /// If the page is served using different encoding than UTF-8, set + /// [defaultEncoding] accordingly. This is required for parsing + /// `multipart/form-data` content correctly. See the class comment + /// for more information on `multipart/form-data`. + HttpBodyHandler({Encoding defaultEncoding = utf8}) + : _defaultEncoding = defaultEncoding; + + /// Process and parse an incoming [HttpRequest]. + /// + /// The returned [HttpRequestBody] contains a `response` field for accessing + /// the [HttpResponse]. + /// + /// See [HttpBodyHandler] for more info on [defaultEncoding]. + static Future processRequest(HttpRequest request, + {Encoding defaultEncoding = utf8}) async { + try { + var body = await _process(request, request.headers, defaultEncoding); + return HttpRequestBody._(request, body); + } catch (e) { + // Try to send BAD_REQUEST response. + request.response.statusCode = HttpStatus.badRequest; + await request.response.close(); + rethrow; + } + } + + /// Process and parse an incoming [HttpClientResponse]. + /// + /// See [HttpBodyHandler] for more info on [defaultEncoding]. 
+ static Future processResponse( + HttpClientResponse response, + {Encoding defaultEncoding = utf8}) async { + var body = await _process(response, response.headers, defaultEncoding); + return HttpClientResponseBody._(response, body); + } + + @override + Stream bind(Stream stream) { + var pending = 0; + var closed = false; + return stream.transform( + StreamTransformer.fromHandlers(handleData: (request, sink) async { + pending++; + try { + var body = + await processRequest(request, defaultEncoding: _defaultEncoding); + sink.add(body); + } catch (e, st) { + sink.addError(e, st); + } finally { + pending--; + if (closed && pending == 0) sink.close(); + } + }, handleDone: (sink) { + closed = true; + if (pending == 0) sink.close(); + })); + } +} + +/// A HTTP content body produced by [HttpBodyHandler] for either [HttpRequest] +/// or [HttpClientResponse]. +class HttpBody { + /// A high-level type value, that reflects how the body was parsed, e.g. + /// "text", "binary" and "json". + final String type; + + /// The content of the body with a type depending on [type]. + final dynamic body; + + HttpBody._(this.type, this.body); +} + +/// The body of a [HttpClientResponse]. +/// +/// Headers can be read through the original [response]. +class HttpClientResponseBody extends HttpBody { + /// The wrapped response. + final HttpClientResponse response; + + HttpClientResponseBody._(this.response, HttpBody body) + : super._(body.type, body.body); +} + +/// The body of a [HttpRequest]. +/// +/// Headers can be read, and a response can be sent, through [request]. +class HttpRequestBody extends HttpBody { + /// The wrapped request. + /// + /// Note that the [HttpRequest] is already drained, so the + /// `Stream` methods cannot be used. + final HttpRequest request; + + HttpRequestBody._(this.request, HttpBody body) + : super._(body.type, body.body); +} + +/// A wrapper around a file upload. +class HttpBodyFileUpload { + /// The filename of the uploaded file. + final String filename; + + /// The [ContentType] of the uploaded file. + /// + /// For `text/*` and `application/json` the [content] field will a String. + final ContentType? contentType; + + /// The content of the file. + /// + /// Either a [String] or a [List]. + final dynamic content; + + HttpBodyFileUpload._(this.contentType, this.filename, this.content); +} + +Future _process(Stream> stream, HttpHeaders headers, + Encoding defaultEncoding) async { + Future asBinary() async { + var builder = await stream.fold( + BytesBuilder(), (builder, data) => builder..add(data)); + return HttpBody._('binary', builder.takeBytes()); + } + + if (headers.contentType == null) { + return asBinary(); + } + + var contentType = headers.contentType!; + + Future asText(Encoding defaultEncoding) async { + Encoding? encoding; + var charset = contentType.charset; + if (charset != null) encoding = Encoding.getByName(charset); + encoding ??= defaultEncoding; + var buffer = await encoding.decoder + .bind(stream) + .fold(StringBuffer(), (dynamic buffer, data) => buffer..write(data)); + return HttpBody._('text', buffer.toString()); + } + + Future asFormData() async { + var values = await MimeMultipartTransformer( + contentType.parameters['boundary']!) 
+ .bind(stream) + .map((part) => + HttpMultipartFormData.parse(part, defaultEncoding: defaultEncoding)) + .map((multipart) async { + dynamic data; + if (multipart.isText) { + var buffer = await multipart.fold( + StringBuffer(), (b, s) => b..write(s)); + data = buffer.toString(); + } else { + var buffer = await multipart.fold( + BytesBuilder(), (b, d) => b..add(d as List)); + data = buffer.takeBytes(); + } + var filename = multipart.contentDisposition.parameters['filename']; + if (filename != null) { + data = HttpBodyFileUpload._(multipart.contentType, filename, data); + } + return [multipart.contentDisposition.parameters['name'], data]; + }).toList(); + var parts = await Future.wait(values); + var map = {}; + for (var part in parts) { + map[part[0] as String] = part[1]; // Override existing entries. + } + return HttpBody._('form', map); + } + + switch (contentType.primaryType) { + case 'text': + return asText(defaultEncoding); + + case 'application': + switch (contentType.subType) { + case 'json': + var body = await asText(utf8); + return HttpBody._('json', jsonDecode(body.body as String)); + + case 'x-www-form-urlencoded': + var body = await asText(ascii); + var map = Uri.splitQueryString(body.body as String, + encoding: defaultEncoding); + var result = {}; + for (var key in map.keys) { + result[key] = map[key]; + } + return HttpBody._('form', result); + + default: + break; + } + break; + + case 'multipart': + switch (contentType.subType) { + case 'form-data': + return asFormData(); + + default: + break; + } + break; + + default: + break; + } + + return asBinary(); +} diff --git a/common/http_server/lib/src/http_multipart_form_data.dart b/common/http_server/lib/src/http_multipart_form_data.dart new file mode 100644 index 0000000..a81b030 --- /dev/null +++ b/common/http_server/lib/src/http_multipart_form_data.dart @@ -0,0 +1,147 @@ +// Copyright (c) 2012, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; + +import 'package:mime/mime.dart'; + +/// The data in a `multipart/form-data` part. +/// +/// ## Example +/// +/// ```dart +/// HttpServer server = ...; +/// server.listen((request) { +/// var boundary = request.headers.contentType.parameters['boundary']; +/// request +/// .transform(MimeMultipartTransformer(boundary)) +/// .map(HttpMultipartFormData.parse) +/// .map((HttpMultipartFormData formData) { +/// // form data object available here. +/// }); +/// ``` +/// +/// [HttpMultipartFormData] is a Stream, serving either bytes or decoded +/// Strings. Use [isText] or [isBinary] to see what type of data is provided. +class HttpMultipartFormData extends Stream { + /// The parsed `Content-Type` header value. + /// + /// `null` if not present. + final ContentType? contentType; + + /// The parsed `Content-Disposition` header value. + /// + /// This field is always present. Use this to extract e.g. name (form field + /// name) and filename (client provided name of uploaded file) parameters. + final HeaderValue contentDisposition; + + /// The parsed `Content-Transfer-Encoding` header value. + /// + /// This field is used to determine how to decode the data. Returns `null` + /// if not present. + final HeaderValue? contentTransferEncoding; + + /// Whether the data is decoded as [String]. + final bool isText; + + /// Whether the data is raw bytes. 
+ bool get isBinary => !isText; + + /// The values which indicate that no incoding was performed. + /// + /// https://www.w3.org/Protocols/rfc1341/5_Content-Transfer-Encoding.html + static const _transparentEncodings = ['7bit', '8bit', 'binary']; + + /// Parse a [MimeMultipart] and return a [HttpMultipartFormData]. + /// + /// If the `Content-Disposition` header is missing or invalid, an + /// [HttpException] is thrown. + /// + /// If the [MimeMultipart] is identified as text, and the `Content-Type` + /// header is missing, the data is decoded using [defaultEncoding]. See more + /// information in the + /// [HTML5 spec](http://dev.w3.org/html5/spec-preview/ + /// constraints.html#multipart-form-data). + static HttpMultipartFormData parse(MimeMultipart multipart, + {Encoding defaultEncoding = utf8}) { + ContentType? contentType; + HeaderValue? encoding; + HeaderValue? disposition; + for (var key in multipart.headers.keys) { + switch (key) { + case 'content-type': + contentType = ContentType.parse(multipart.headers[key]!); + break; + + case 'content-transfer-encoding': + encoding = HeaderValue.parse(multipart.headers[key]!); + break; + + case 'content-disposition': + disposition = HeaderValue.parse(multipart.headers[key]!, + preserveBackslash: true); + break; + + default: + break; + } + } + if (disposition == null) { + throw const HttpException( + "Mime Multipart doesn't contain a Content-Disposition header value"); + } + if (encoding != null && + !_transparentEncodings.contains(encoding.value.toLowerCase())) { + // TODO(ajohnsen): Support BASE64, etc. + throw HttpException('Unsupported contentTransferEncoding: ' + '${encoding.value}'); + } + + Stream stream = multipart; + var isText = contentType == null || + contentType.primaryType == 'text' || + contentType.mimeType == 'application/json'; + if (isText) { + Encoding? encoding; + if (contentType?.charset != null) { + encoding = Encoding.getByName(contentType!.charset); + } + encoding ??= defaultEncoding; + stream = stream.transform(encoding.decoder); + } + return HttpMultipartFormData._( + contentType, disposition, encoding, multipart, stream, isText); + } + + final MimeMultipart _mimeMultipart; + + final Stream _stream; + + HttpMultipartFormData._( + this.contentType, + this.contentDisposition, + this.contentTransferEncoding, + this._mimeMultipart, + this._stream, + this.isText); + + @override + StreamSubscription listen(void Function(dynamic)? onData, + {void Function()? onDone, Function? onError, bool? cancelOnError}) { + return _stream.listen(onData, + onDone: onDone, onError: onError, cancelOnError: cancelOnError); + } + + /// Returns the value for the header named [name]. + /// + /// If there is no header with the provided name, `null` will be returned. + /// + /// Use this method to index other headers available in the original + /// [MimeMultipart]. + String? value(String name) { + return _mimeMultipart.headers[name]; + } +} diff --git a/common/http_server/lib/src/virtual_directory.dart b/common/http_server/lib/src/virtual_directory.dart new file mode 100644 index 0000000..60ab51f --- /dev/null +++ b/common/http_server/lib/src/virtual_directory.dart @@ -0,0 +1,461 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. 
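The two classes above cover the whole multipart path: `MimeMultipartTransformer` splits the request body into parts and `HttpMultipartFormData.parse` decodes each one. A minimal, self-contained sketch of that flow follows; the port and the way each field is printed are illustrative assumptions, not part of the package:

```dart
import 'dart:io';

import 'package:mime/mime.dart';
import 'package:platform_http_server/http_server.dart';

Future<void> main() async {
  // Hypothetical local server; any free port works.
  var server = await HttpServer.bind(InternetAddress.loopbackIPv4, 8080);
  await for (var request in server) {
    var contentType = request.headers.contentType;
    if (contentType?.mimeType == 'multipart/form-data') {
      var boundary = contentType!.parameters['boundary']!;
      var parts = MimeMultipartTransformer(boundary)
          .bind(request)
          .map(HttpMultipartFormData.parse);
      await for (var part in parts) {
        var name = part.contentDisposition.parameters['name'];
        if (part.isText) {
          // Text parts are streamed as decoded String chunks.
          print('$name: ${await part.join()}');
        } else {
          // Binary parts are streamed as List<int> chunks.
          var bytes = await part.fold<List<int>>(
              [], (b, d) => b..addAll(d as List<int>));
          print('$name: ${bytes.length} bytes');
        }
      }
    }
    await request.response.close();
  }
}
```

Each part has to be drained before the transformer produces the next one, which is why the loop fully consumes a part (via `join` or `fold`) before moving on.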
+
+import 'dart:async';
+import 'dart:convert';
+import 'dart:io';
+
+import 'package:platform_http_server/src/has_current_iterator.dart';
+import 'package:mime/mime.dart';
+import 'package:path/path.dart';
+
+// Used to signal a directory redirect, where a trailing slash is missing.
+class _DirectoryRedirect {
+  const _DirectoryRedirect();
+}
+
+/// A [VirtualDirectory] can serve files and directory listings from a root
+/// path, to [HttpRequest]s.
+///
+/// The [VirtualDirectory] provides secure handling of request uris and
+/// file-system links, correct mime-types and custom error pages.
+class VirtualDirectory {
+  final String root;
+
+  /// Whether to allow listing files in a directory.
+  ///
+  /// When true the response to a request for a directory will be an HTML
+  /// document with a table of links to all files within the directory, along
+  /// with their size and last modified time. The default behavior can be
+  /// overridden by setting a [directoryHandler].
+  bool allowDirectoryListing = false;
+
+  /// Whether to allow reading resources via a link.
+  bool followLinks = true;
+
+  /// Whether to prevent access outside of [root] via relative paths or links.
+  bool jailRoot = true;
+
+  final List<String> _pathPrefixSegments;
+
+  final RegExp _invalidPathRegExp = RegExp('[\\/\x00]');
+
+  void Function(HttpRequest)? _errorCallback;
+  void Function(Directory, HttpRequest)? _dirCallback;
+
+  static List<String> _parsePathPrefix(String? pathPrefix) {
+    if (pathPrefix == null) return [];
+    return Uri(path: pathPrefix)
+        .pathSegments
+        .where((segment) => segment.isNotEmpty)
+        .toList();
+  }
+
+  /// Create a new [VirtualDirectory] for serving static file content of the
+  /// path [root].
+  ///
+  /// The [root] is not required to exist. If the [root] doesn't exist at time
+  /// of a request, a 404 response is generated.
+  ///
+  /// If [pathPrefix] is set, [pathPrefix] indicates the expected path prefix
+  /// of incoming requests. When locating the resource on disk, the prefix is
+  /// trimmed from the request's uri before locating the actual resource.
+  /// If the request's uri doesn't start with [pathPrefix], a 404 response is
+  /// generated.
+  VirtualDirectory(this.root, {String? pathPrefix})
+      : _pathPrefixSegments = _parsePathPrefix(pathPrefix);
+
+  /// Serve a [Stream] of [HttpRequest]s, in this [VirtualDirectory].
+  StreamSubscription<HttpRequest> serve(Stream<HttpRequest> requests) =>
+      requests.listen(serveRequest);
+
+  /// Serve a single [HttpRequest], in this [VirtualDirectory].
+  Future serveRequest(HttpRequest request) async {
+    var iterator = HasCurrentIterator(request.uri.pathSegments.iterator);
+    iterator.moveNext();
+    for (var segment in _pathPrefixSegments) {
+      if (!iterator.hasCurrent || iterator.current != segment) {
+        _serveErrorPage(HttpStatus.notFound, request);
+        return request.response.done;
+      }
+      iterator.moveNext();
+    }
+
+    var entity = await _locateResource('.', iterator);
+    if (entity is File) {
+      serveFile(entity, request);
+    } else if (entity is Directory) {
+      if (allowDirectoryListing) {
+        _serveDirectory(entity, request);
+      } else {
+        _serveErrorPage(HttpStatus.notFound, request);
+      }
+    } else if (entity is _DirectoryRedirect) {
+      _unawaited(request.response.redirect(Uri.parse('${request.uri}/'),
+          status: HttpStatus.movedPermanently));
+    } else {
+      assert(entity == null);
+      _serveErrorPage(HttpStatus.notFound, request);
+    }
+    return request.response.done;
+  }
+
+  /// Overrides the default directory listing.
+  ///
+  /// When invoked the [callback] should respond through the [HttpRequest] with
+  /// a directory listing.
+  set directoryHandler(void Function(Directory, HttpRequest) callback) {
+    _dirCallback = callback;
+  }
+
+  /// Overrides the default error handler.
+  ///
+  /// When [callback] is invoked, the `statusCode` property of the response is
+  /// set.
+  set errorPageHandler(void Function(HttpRequest) callback) {
+    _errorCallback = callback;
+  }
+
+  Future _locateResource(
+      String path, HasCurrentIterator segments) async {
+    // Don't allow navigating up paths.
+    if (segments.hasCurrent && segments.current == '..') {
+      return Future.value(null);
+    }
+    path = normalize(path);
+    // If we jail to root, the relative path can never go up.
+    if (jailRoot && split(path).first == '..') return Future.value(null);
+    String fullPath() => join(root, path);
+    var type = await FileSystemEntity.type(fullPath(), followLinks: false);
+    switch (type) {
+      case FileSystemEntityType.file:
+        if (!segments.hasCurrent) {
+          return File(fullPath());
+        }
+        break;
+
+      case FileSystemEntityType.directory:
+        String dirFullPath() => '${fullPath()}$separator';
+        if (!segments.hasCurrent) {
+          if (path == '.') return Directory(dirFullPath());
+          return const _DirectoryRedirect();
+        }
+        var current = segments.current;
+        var hasNext = segments.moveNext();
+        if (!hasNext && current == '') {
+          return Directory(dirFullPath());
+        } else {
+          if (_invalidPathRegExp.hasMatch(current)) break;
+          return _locateResource(join(path, current), segments);
+        }
+
+      case FileSystemEntityType.link:
+        if (followLinks) {
+          var target = await Link(fullPath()).target();
+          var targetPath = normalize(target);
+          if (isAbsolute(targetPath)) {
+            // If we jail to root, the path can never be absolute.
+            if (jailRoot) return null;
+            return _locateResource(targetPath, segments);
+          } else {
+            targetPath = join(dirname(path), targetPath);
+            return _locateResource(targetPath, segments);
+          }
+        }
+        break;
+      case FileSystemEntityType.notFound:
+        break;
+      default:
+        break;
+    }
+    // Return `null` on fall-through, to indicate NOT_FOUND.
+    return null;
+  }
+
+  /// Serve the content of [file] to [request].
+  ///
+  /// Can be used in overrides of [directoryHandler] to redirect to an index
+  /// file.
+  ///
+  /// If the request contains the [HttpHeaders.ifModifiedSince] header,
+  /// [serveFile] will send a [HttpStatus.notModified] response if the file
+  /// was not changed.
+  ///
+  /// Note that if it was unable to read from [file], the [request]'s response
+  /// is closed with error-code [HttpStatus.notFound].
+  void serveFile(File file, HttpRequest request) async {
+    var response = request.response;
+    // TODO(ajohnsen): Set up Zone support for these errors.
+    try {
+      var lastModified = await file.lastModified();
+      if (request.headers.ifModifiedSince != null &&
+          !lastModified.isAfter(request.headers.ifModifiedSince!)) {
+        response.statusCode = HttpStatus.notModified;
+        await response.close();
+        return null;
+      }
+
+      response.headers.set(HttpHeaders.lastModifiedHeader, lastModified);
+      response.headers.set(HttpHeaders.acceptRangesHeader, 'bytes');
+
+      var length = await file.length();
+      var range = request.headers.value(HttpHeaders.rangeHeader);
+      if (range != null) {
+        // We only support one range, while the standard supports several.
+        var matches = RegExp(r'^bytes=(\d*)\-(\d*)$').firstMatch(range);
+        // If the range header has the right format, handle it.
+        if (matches != null &&
+            (matches[1]!.isNotEmpty || matches[2]!.isNotEmpty)) {
+          // Serve sub-range.
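+          // The supported range forms are:
+          //   bytes=A-B  serves bytes A through B inclusive,
+          //   bytes=A-   serves from byte A to the end of the file,
+          //   bytes=-N   serves the last N bytes (a suffix range).
+          // Syntactically invalid ranges are ignored and the whole file is
+          // served instead (RFC 2616 section 14.35.1).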
+ int start; // First byte position - inclusive. + int end; // Last byte position - inclusive. + if (matches[1]!.isEmpty) { + start = length - int.parse(matches[2]!); + if (start < 0) start = 0; + end = length - 1; + } else { + start = int.parse(matches[1]!); + end = matches[2]!.isEmpty ? length - 1 : int.parse(matches[2]!); + } + // If the range is syntactically invalid the Range header + // MUST be ignored (RFC 2616 section 14.35.1). + if (start <= end) { + if (end >= length) { + end = length - 1; + } + + if (start >= length) { + response.statusCode = HttpStatus.requestedRangeNotSatisfiable; + await response.close(); + return; + } + + // Override Content-Length with the actual bytes sent. + response.headers + .set(HttpHeaders.contentLengthHeader, end - start + 1); + + // Set 'Partial Content' status code. + response + ..statusCode = HttpStatus.partialContent + ..headers.set( + HttpHeaders.contentRangeHeader, 'bytes $start-$end/$length'); + + // Pipe the 'range' of the file. + if (request.method == 'HEAD') { + await response.close(); + } else { + try { + await file + .openRead(start, end + 1) + .cast>() + .pipe(_VirtualDirectoryFileStream(response, file.path)); + } catch (_) { + // TODO(kevmoo): log errors + } + } + return; + } + } + } + + response.headers.set(HttpHeaders.contentLengthHeader, length); + if (request.method == 'HEAD') { + await response.close(); + } else { + try { + await file + .openRead() + .cast>() + .pipe(_VirtualDirectoryFileStream(response, file.path)); + } catch (_) { + // TODO(kevmoo): log errors + } + } + } catch (_) { + response.statusCode = HttpStatus.notFound; + await response.close(); + } + } + + void _serveDirectory(Directory dir, HttpRequest request) async { + if (_dirCallback != null) { + _dirCallback!(dir, request); + return; + } + var response = request.response; + try { + var stats = await dir.stat(); + if (request.headers.ifModifiedSince != null && + !stats.modified.isAfter(request.headers.ifModifiedSince!)) { + response.statusCode = HttpStatus.notModified; + await response.close(); + return; + } + + response.headers.contentType = + ContentType('text', 'html', parameters: {'charset': 'utf-8'}); + response.headers.set(HttpHeaders.lastModifiedHeader, stats.modified); + var path = Uri.decodeComponent(request.uri.path); + var encodedPath = const HtmlEscape().convert(path); + var header = + ''' + + +Index of $encodedPath + + +

+<h1>Index of $encodedPath</h1>
+<table>
+  <tr>
+    <td>Name</td>
+    <td>Last modified</td>
+    <td>Size</td>
+  </tr>
+''';
+      var server = response.headers.value(HttpHeaders.serverHeader);
+      server ??= '';
+      var footer = '''</table>
+$server
+</body>
+</html>
+''';
+
+      response.write(header);
+
+      void add(String name, String? modified, var size, bool folder) {
+        size ??= '-';
+        modified ??= '';
+        var encodedSize = const HtmlEscape().convert(size.toString());
+        var encodedModified = const HtmlEscape().convert(modified);
+        var encodedLink = const HtmlEscape(HtmlEscapeMode.attribute)
+            .convert(Uri.encodeComponent(name));
+        if (folder) {
+          encodedLink += '/';
+          name += '/';
+        }
+        var encodedName = const HtmlEscape().convert(name);
+
+        var entry = '''  <tr>
+    <td><a href="$encodedLink">$encodedName</a></td>
+    <td>$encodedModified</td>
+    <td>$encodedSize</td>
+  </tr>
+''';
+        response.write(entry);
+      }
+
+      if (path != '/') {
+        add('..', null, null, true);
+      }
+
+      dir.list(followLinks: true).listen((entity) {
+        var name = basename(entity.path);
+        var stat = entity.statSync();
+        if (entity is File) {
+          add(name, stat.modified.toString(), stat.size, false);
+        } else if (entity is Directory) {
+          add(name, stat.modified.toString(), null, true);
+        }
+      }, onError: (e) {
+        // TODO(kevmoo): log error
+      }, onDone: () {
+        response.write(footer);
+        response.close();
+      });
+    } catch (_) {
+      // TODO(kevmoo): log error
+      await response.close();
+    }
+  }
+
+  void _serveErrorPage(int error, HttpRequest request) {
+    var response = request.response;
+    response.statusCode = error;
+    if (_errorCallback != null) {
+      _errorCallback!(request);
+      return;
+    }
+    response.headers.contentType =
+        ContentType('text', 'html', parameters: {'charset': 'utf-8'});
+    // Default error page.
+    var path = Uri.decodeComponent(request.uri.path);
+    var encodedPath = const HtmlEscape().convert(path);
+    var encodedReason = const HtmlEscape().convert(response.reasonPhrase);
+    var encodedError = const HtmlEscape().convert(error.toString());
+
+    var server = response.headers.value(HttpHeaders.serverHeader);
+    server ??= '';
+    var page = '''<!DOCTYPE html>
+<html>
+<head>
+<title>$encodedReason: $encodedPath</title>
+</head>
+<body>
+<h1>Error $encodedError at '$encodedPath': $encodedReason</h1>
+$server + +'''; + response.write(page); + response.close(); + } +} + +class _VirtualDirectoryFileStream implements StreamConsumer> { + final HttpResponse response; + final String path; + List? buffer = []; + + _VirtualDirectoryFileStream(this.response, this.path); + + @override + Future addStream(Stream> stream) { + stream.listen((data) { + if (buffer == null) { + response.add(data); + return; + } + if (buffer!.isEmpty) { + if (data.length >= defaultMagicNumbersMaxLength) { + setMimeType(data); + response.add(data); + buffer = null; + } else { + buffer!.addAll(data); + } + } else { + buffer!.addAll(data); + if (buffer!.length >= defaultMagicNumbersMaxLength) { + setMimeType(buffer); + response.add(buffer!); + buffer = null; + } + } + }, onDone: () { + if (buffer != null) { + if (buffer!.isEmpty) { + setMimeType(null); + } else { + setMimeType(buffer); + response.add(buffer!); + } + } + response.close(); + }, onError: response.addError); + return response.done; + } + + @override + Future close() => Future.value(); + + void setMimeType(List? bytes) { + var mimeType = lookupMimeType(path, headerBytes: bytes); + if (mimeType != null) { + response.headers.contentType = ContentType.parse(mimeType); + } + } +} + +// Copied from `package:pedantic` to avoid the dep. +void _unawaited(Future f) {} diff --git a/common/http_server/lib/src/virtual_host.dart b/common/http_server/lib/src/virtual_host.dart new file mode 100644 index 0000000..1f16c26 --- /dev/null +++ b/common/http_server/lib/src/virtual_host.dart @@ -0,0 +1,141 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:io'; + +/// A utility for handling multiple hosts on multiple sources using a +/// named-based approach. +abstract class VirtualHost { + /// The [Stream] of [HttpRequest] not matching any hosts. + /// + /// If this field is not read the default implementation will result in a + /// [HttpStatus.forbidden] response. + Stream get unhandled; + + /// Construct an [VirtualHost] with an optional initial source. + /// + /// The optional [source] is a shortcut for calling [addSource]. + /// + /// ## Example + /// + /// ```dart + /// var server = await HttpServer.bind(..., 80); + /// var virtualHost = VirtualHost(server); + /// virtualServer.addHost('static.myserver.com').listen(...); + /// virtualServer.addHost('cache.myserver.com').listen(...); + /// ``` + factory VirtualHost([Stream? source]) => _VirtualHost(source); + + /// Provide another source of [HttpRequest]s in the form of a [Stream]. + void addSource(Stream source); + + /// Add a host to the [VirtualHost] instance. + /// + /// The host can be either a specific domain (`my.domain.name`) or a + /// wildcard-based domain name (`*.domain.name`). The former will only match + /// the specific domain name while the latter will match any series of + /// sub-domains. + /// + /// If both `my.domain.name` and `*.domain.name` is specified, the most + /// qualified will take precedence, `my.domain.name` in this case. + Stream addHost(String host); +} + +class _VirtualHostDomain { + StreamController? any; + StreamController? exact; + Map subDomains = {}; +} + +class _VirtualHost implements VirtualHost { + final _VirtualHostDomain _topDomain = _VirtualHostDomain(); + StreamController? 
_unhandledController; + + @override + Stream get unhandled { + _unhandledController ??= StreamController(); + + return _unhandledController!.stream; + } + + _VirtualHost([Stream? source]) { + if (source != null) addSource(source); + } + + @override + void addSource(Stream source) { + source.listen((request) { + var host = request.headers.host; + if (host == null) { + _unhandled(request); + return; + } + var domains = host.split('.'); + var current = _topDomain; + StreamController? any; + for (var i = domains.length - 1; i >= 0; i--) { + if (current.any != null) any = current.any; + if (i == 0) { + var last = current.subDomains[domains[i]]; + if (last != null && last.exact != null) { + last.exact!.add(request); + return; + } + } else { + if (!current.subDomains.containsKey(domains[i])) { + break; + } + current = current.subDomains[domains[i]]!; + } + } + if (any != null) { + any.add(request); + return; + } + _unhandled(request); + }); + } + + @override + Stream addHost(String host) { + if (host.lastIndexOf('*') > 0) { + throw ArgumentError( + 'Wildcards are only allowed in the beginning of a host'); + } + var controller = StreamController(); + var domains = host.split('.'); + var current = _topDomain; + for (var i = domains.length - 1; i >= 0; i--) { + if (domains[i] == '*') { + if (current.any != null) { + throw ArgumentError('Host is already provided'); + } + current.any = controller; + } else { + if (!current.subDomains.containsKey(domains[i])) { + current.subDomains[domains[i]] = _VirtualHostDomain(); + } + if (i > 0) { + current = current.subDomains[domains[i]]!; + } else { + if (current.subDomains[domains[i]]!.exact != null) { + throw ArgumentError('Host is already provided'); + } + current.subDomains[domains[i]]!.exact = controller; + } + } + } + return controller.stream; + } + + void _unhandled(HttpRequest request) { + if (_unhandledController != null) { + _unhandledController!.add(request); + return; + } + request.response.statusCode = HttpStatus.forbidden; + request.response.close(); + } +} diff --git a/common/http_server/pubspec.yaml b/common/http_server/pubspec.yaml new file mode 100644 index 0000000..5b129c2 --- /dev/null +++ b/common/http_server/pubspec.yaml @@ -0,0 +1,14 @@ +name: platform_http_server +version: 4.5.1 +description: A collection of useful utility classes for building HTTP server. 
+repository: https://github.com/dart-backend/platform_http_server +homepage: https://github.com/dart-backend/platform_http_server +environment: + sdk: '>=3.5.0 <4.0.0' +dependencies: + mime: ^2.0.0 + path: ^1.9.0 +dev_dependencies: + lints: ^5.0.0 + test: ^1.25.0 + test_api: ^0.7.0 diff --git a/common/http_server/test/certificates/server_chain.pem b/common/http_server/test/certificates/server_chain.pem new file mode 100644 index 0000000..341a86f --- /dev/null +++ b/common/http_server/test/certificates/server_chain.pem @@ -0,0 +1,59 @@ +-----BEGIN CERTIFICATE----- +MIIDZDCCAkygAwIBAgIBATANBgkqhkiG9w0BAQsFADAgMR4wHAYDVQQDDBVpbnRl +cm1lZGlhdGVhdXRob3JpdHkwHhcNMTUxMDI3MTAyNjM1WhcNMjUxMDI0MTAyNjM1 +WjAUMRIwEAYDVQQDDAlsb2NhbGhvc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAw +ggEKAoIBAQCkg/Qr8RQeLTOSgCkyiEX2ztgkgscX8hKGHEHdvlkmVK3JVEIIwkvu +/Y9LtHZUia3nPAgqEEbexzTENZjSCcC0V6I2XW/e5tIE3rO0KLZyhtZhN/2SfJ6p +KbOh0HLr1VtkKJGp1tzUmHW/aZI32pK60ZJ/N917NLPCJpCaL8+wHo3+w3oNqln6 +oJsfgxy9SUM8Bsc9WMYKMUdqLO1QKs1A5YwqZuO7Mwj+4LY2QDixC7Ua7V9YAPo2 +1SBeLvMCHbYxSPCuxcZ/kDkgax/DF9u7aZnGhMImkwBka0OQFvpfjKtTIuoobTpe +PAG7MQYXk4RjnjdyEX/9XAQzvNo1CDObAgMBAAGjgbQwgbEwPAYDVR0RBDUwM4IJ +bG9jYWxob3N0ggkxMjcuMC4wLjGCAzo6MYcEfwAAAYcQAAAAAAAAAAAAAAAAAAAA +ATAMBgNVHRMBAf8EAjAAMB0GA1UdDgQWBBSvhJo6taTggJQBukEvMo/PDk8tKTAf +BgNVHSMEGDAWgBS98L4T5RaIToE3DkBRsoeWPil0eDAOBgNVHQ8BAf8EBAMCA6gw +EwYDVR0lBAwwCgYIKwYBBQUHAwEwDQYJKoZIhvcNAQELBQADggEBAHLOt0mL2S4A +B7vN7KsfQeGlVgZUVlEjem6kqBh4fIzl4CsQuOO8oJ0FlO1z5JAIo98hZinymJx1 +phBVpyGIKakT/etMH0op5evLe9dD36VA3IM/FEv5ibk35iGnPokiJXIAcdHd1zam +YaTHRAnZET5S03+7BgRTKoRuszhbvuFz/vKXaIAnVNOF4Gf2NUJ/Ax7ssJtRkN+5 +UVxe8TZVxzgiRv1uF6NTr+J8PDepkHCbJ6zEQNudcFKAuC56DN1vUe06gRDrNbVq +2JHEh4pRfMpdsPCrS5YHBjVq/XHtFHgwDR6g0WTwSUJvDeM4OPQY5f61FB0JbFza +PkLkXmoIod8= +-----END CERTIFICATE----- +-----BEGIN CERTIFICATE----- +MIIDLjCCAhagAwIBAgIBAjANBgkqhkiG9w0BAQsFADAYMRYwFAYDVQQDDA1yb290 +YXV0aG9yaXR5MB4XDTE1MTAyNzEwMjYzNVoXDTI1MTAyNDEwMjYzNVowIDEeMBwG +A1UEAwwVaW50ZXJtZWRpYXRlYXV0aG9yaXR5MIIBIjANBgkqhkiG9w0BAQEFAAOC +AQ8AMIIBCgKCAQEA6GndRFiXk+2q+Ig7ZOWKKGta+is8137qyXz+eVFs5sA0ajMN +ZBAMWS0TIXw/Yks+y6fEcV/tfv91k1eUN4YXPcoxTdDF97d2hO9wxumeYOMnQeDy +VZVDKQBZ+jFMeI+VkNpMEdmsLErpZDGob/1dC8tLEuR6RuRR8X6IDGMPOCMw1jLK +V1bQjPtzqKadTscfjLuKxuLgspJdTrzsu6hdcl1mm8K6CjTY2HNXWxs1yYmwfuQ2 +Z4/8sOMNqFqLjN+ChD7pksTMq7IosqGiJzi2bpd5f44ek/k822Y0ATncJHk4h1Z+ +kZBnW6kgcLna1gDri9heRwSZ+M8T8nlHgIMZIQIDAQABo3sweTASBgNVHRMBAf8E +CDAGAQH/AgEAMB0GA1UdDgQWBBS98L4T5RaIToE3DkBRsoeWPil0eDAfBgNVHSME +GDAWgBRxD5DQHTmtpDFKDOiMf5FAi6vfbzAOBgNVHQ8BAf8EBAMCAgQwEwYDVR0l +BAwwCgYIKwYBBQUHAwEwDQYJKoZIhvcNAQELBQADggEBAD+4KpUeV5mUPw5IG/7w +eOXnUpeS96XFGuS1JuFo/TbgntPWSPyo+rD4GrPIkUXyoHaMCDd2UBEjyGbBIKlB +NZA3RJOAEp7DTkLNK4RFn/OEcLwG0J5brL7kaLRO4vwvItVIdZ2XIqzypRQTc0MG +MmF08zycnSlaN01ryM67AsMhwdHqVa+uXQPo8R8sdFGnZ33yywTYD73FeImXilQ2 +rDnFUVqmrW1fjl0Fi4rV5XI0EQiPrzKvRtmF8ZqjGATPOsRd64cwQX6V+P5hNeIR +9pba6td7AbNGausHfacRYMyoGJWWWkFPd+7jWOCPqW7Fk1tmBgdB8GzXa3inWIRM +RUE= +-----END CERTIFICATE----- +-----BEGIN CERTIFICATE----- +MIIC+zCCAeOgAwIBAgIBATANBgkqhkiG9w0BAQsFADAYMRYwFAYDVQQDDA1yb290 +YXV0aG9yaXR5MB4XDTE1MTAyNzEwMjYzNFoXDTI1MTAyNDEwMjYzNFowGDEWMBQG +A1UEAwwNcm9vdGF1dGhvcml0eTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoC +ggEBAMl+dcraUM/E7E6zl7+7hK9oUJYXJLnfiMtP/TRFVbH4+2aEN8vXzPbzKdR3 +FfaHczXQTwnTCaYA4u4uSDvSOsFFEfxEwYORsdKmQEM8nGpVX2NVvKsMcGIhh8kh +ZwJfkMIOcAxmGIHGdMhF8VghonJ8uGiuqktxdfpARq0g3fqIjDHsF9/LpfshUfk9 +wsRyTF0yr90U/dsfnE+u8l7GvVl8j2Zegp0sagAGtLaNv7tP17AibqEGg2yDBrBN +9r9ihe4CqMjx+Q2kQ2S9Gz2V2ReO/n6vm2VQxsPRB/lV/9jh7cUcS0/9mggLYrDy 
+cq1v7rLLQrWuxMz1E3gOhyCYJ38CAwEAAaNQME4wHQYDVR0OBBYEFHEPkNAdOa2k +MUoM6Ix/kUCLq99vMB8GA1UdIwQYMBaAFHEPkNAdOa2kMUoM6Ix/kUCLq99vMAwG +A1UdEwQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBABrhjnWC6b+z9Kw73C/niOwo +9sPdufjS6tb0sCwDjt3mjvE4NdNWt+/+ZOugW6dqtvqhtqZM1q0u9pJkNwIrqgFD +ZHcfNaf31G6Z2YE+Io7woTVw6fFobg/EFo+a/qwbvWL26McmiRL5yiSBjVjpX4a5 +kdZ+aPQUCBaLrTWwlCDqzSVIULWUQvveRWbToMFKPNID58NtEpymAx3Pgir7YjV9 +UnlU2l5vZrh1PTCqZxvC/IdRESUfW80LdHaeyizRUP+6vKxGgSz2MRuYINjbd6GO +hGiCpWlwziW2xLV1l2qSRLko2kIafLZP18N0ThM9zKbU5ps9NgFOf//wqSGtLaE= +-----END CERTIFICATE----- diff --git a/common/http_server/test/certificates/server_key.pem b/common/http_server/test/certificates/server_key.pem new file mode 100644 index 0000000..895b7d2 --- /dev/null +++ b/common/http_server/test/certificates/server_key.pem @@ -0,0 +1,29 @@ +-----BEGIN ENCRYPTED PRIVATE KEY----- +MIIE4zAcBgoqhkiG9w0BDAEBMA4ECBMCjlg8JYZ4AgIIAASCBMFd9cBoZ5xcTock +AVQcg/HzYJtMceKn1gtMDdC7mmXuyN0shoxhG4BpQInHkFARL+nenesXFxEm4X5e +L603Pcgw72/ratxVpTW7hPMjiLTEBqza0GjQm7Sarbdy+Vzdp/6XFrAcPfFl1juY +oyYzbozPsvFHz3Re44y1KmI4HAzU/qkjJUbNTTiPPVI2cDP6iYN2XXxBb1wwp8jR +iqdZqFG7lU/wvPEbD7BVPpmJBHWNG681zb4ea5Zn4hW8UaxpiIBiaH0/IWc2SVZd +RliAFo3NEsGxCcsnBo/n00oudGbOJxdOp7FbH5hJpeqX2WhCyJRxIeHOWmeuMAet +03HFriiEmJ99m2nEJN1x0A3QUUM7ji6vZAb4qb1dyq7LlX4M2aaqixRnaTcQkapf +DOxX35DEBXSKrDpyWp6Rx4wNpUyi1TKyhaVnYgD3Gn0VfC/2w86gSFlrf9PMYGM0 +PvFxTDzTyjOuPBRa728gZOGXgDOL7qvdInU/opVew7kFeRQHXxHzFCLK5dD+Vrig +5fS3m0++f55ODkxqHXB8gbXbd3GMmsW6MrGpU7VsCNtbVPdSMW0FalovEB0M+2lj +1VfuvL+0F5huTe+BgZAt6xgET/CIcZXdNMRPVhraqUjqWtI9Rdk4STPCpU1rDkjG +YDl/fo4W2T6qQWFUpiC9IvVVGkVxaqfZZ4Qu+V5xPUi6vk95QiTNkN1t+m+sCCgS +Llkea8Um0aHMy33Lj3NsfL0LMrnpniqcAks8BvcgIZwk1VRqcj7BQVCygJSYrmAR +DBhMpjWlXuSggnyVPuduZDtnTN+8lCHLOKL3a3bDb6ySaKX49Km6GutDLfpDtEA0 +3mQvmEG4XVm7zy+AlN72qFbtSLDRi/D/uQh2q/ZrFQLOBQBQB56TvEbKouLimUDM +ascQA3aUyhOE7e+d02NOFIFTozwc/C//CIFeA+ZEwxyfha/3Bor6Jez7PC/eHNxZ +w7YMXzPW9NhcCcerhYGebuCJxLwzqJ+IGdukjKsGV2ytWDoB2xZiJNu096j4RKcq +YSJoen0R7IH8N4eDujXR8m9kAl724Uqs1OoAs4VNICvzTutbsgVZ6Z+NMOcfnPw9 +jZkFhot16w8znD+OmhBR7/bzOLpaeUhk7EhNq5M6U0NNWx3WwkDlvU/jx+6/EQe3 +iLEHptH2HYBF1xscaKGbtKNtuQsfdzgWpOX0qK2YbK3yCKvL/xIm1DQmDZDKkWdW +VNh8oGV1H96CivWlvxhAgXKz9F/83CjMw8YXRk7RJvWR4vtNvXFAvGkFIYCN9Jv9 +p+1ukaYoxSLGBik907I6gWSHqumJiCprUyAX/bVfZfNiYh4hzeA3lhwxZSax3JG4 +7QFPvyepOmF/3AAzS/Pusx6jOZnuCMCkfQi6Wpem1o3s4x+fP7kz00Xuj01ErucM +S10ixfIh84kXBN3dTRDtDdeCyoMsBKO0W5jDBBlWL02YfdF6Opo1Q4cPh2DYgXMh +XEszNZSK5LB0y+f3A6Kdx/hkZzHVvMONA70OyrkoZzGyWENhcB0c7ntTJyPPD2qM +s0HRA2VwF/0ypU3OKERM1Ua5NSkTgvnnVTlV9GO90Tkn5v4fxdl8NzIuJLyGguTP +Xc0tRM34Lg== +-----END ENCRYPTED PRIVATE KEY----- diff --git a/common/http_server/test/certificates/trusted_certs.pem b/common/http_server/test/certificates/trusted_certs.pem new file mode 100644 index 0000000..8b5bf3e --- /dev/null +++ b/common/http_server/test/certificates/trusted_certs.pem @@ -0,0 +1,18 @@ +-----BEGIN CERTIFICATE----- +MIIC+zCCAeOgAwIBAgIBATANBgkqhkiG9w0BAQsFADAYMRYwFAYDVQQDDA1yb290 +YXV0aG9yaXR5MB4XDTE1MTAyNzEwMjYzNFoXDTI1MTAyNDEwMjYzNFowGDEWMBQG +A1UEAwwNcm9vdGF1dGhvcml0eTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoC +ggEBAMl+dcraUM/E7E6zl7+7hK9oUJYXJLnfiMtP/TRFVbH4+2aEN8vXzPbzKdR3 +FfaHczXQTwnTCaYA4u4uSDvSOsFFEfxEwYORsdKmQEM8nGpVX2NVvKsMcGIhh8kh +ZwJfkMIOcAxmGIHGdMhF8VghonJ8uGiuqktxdfpARq0g3fqIjDHsF9/LpfshUfk9 +wsRyTF0yr90U/dsfnE+u8l7GvVl8j2Zegp0sagAGtLaNv7tP17AibqEGg2yDBrBN +9r9ihe4CqMjx+Q2kQ2S9Gz2V2ReO/n6vm2VQxsPRB/lV/9jh7cUcS0/9mggLYrDy +cq1v7rLLQrWuxMz1E3gOhyCYJ38CAwEAAaNQME4wHQYDVR0OBBYEFHEPkNAdOa2k 
+MUoM6Ix/kUCLq99vMB8GA1UdIwQYMBaAFHEPkNAdOa2kMUoM6Ix/kUCLq99vMAwG +A1UdEwQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBABrhjnWC6b+z9Kw73C/niOwo +9sPdufjS6tb0sCwDjt3mjvE4NdNWt+/+ZOugW6dqtvqhtqZM1q0u9pJkNwIrqgFD +ZHcfNaf31G6Z2YE+Io7woTVw6fFobg/EFo+a/qwbvWL26McmiRL5yiSBjVjpX4a5 +kdZ+aPQUCBaLrTWwlCDqzSVIULWUQvveRWbToMFKPNID58NtEpymAx3Pgir7YjV9 +UnlU2l5vZrh1PTCqZxvC/IdRESUfW80LdHaeyizRUP+6vKxGgSz2MRuYINjbd6GO +hGiCpWlwziW2xLV1l2qSRLko2kIafLZP18N0ThM9zKbU5ps9NgFOf//wqSGtLaE= +-----END CERTIFICATE----- diff --git a/common/http_server/test/has_current_iterator_test.dart b/common/http_server/test/has_current_iterator_test.dart new file mode 100644 index 0000000..ac171b1 --- /dev/null +++ b/common/http_server/test/has_current_iterator_test.dart @@ -0,0 +1,72 @@ +import 'package:platform_http_server/src/has_current_iterator.dart'; +import 'package:test/test.dart'; + +void main() { + const mockFirstItem = 'foo'; + const mockLastItem = 'bar'; + const mockItems = [mockFirstItem, mockLastItem]; + group('When testing hasCurrent', () { + test('should return false to start', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + expect(hasCurrentIterator.hasCurrent, isFalse); + }); + + group('With a single item list', () { + const mockSingleList = [mockFirstItem]; + test('Should return true.', () { + final hasCurrentIterator = HasCurrentIterator(mockSingleList.iterator); + hasCurrentIterator.moveNext(); + expect(hasCurrentIterator.hasCurrent, isTrue); + }); + }); + + group('with an empty list', () { + test('should return false.', () { + final hasCurrentIterator = HasCurrentIterator([].iterator); + hasCurrentIterator.moveNext(); + expect(hasCurrentIterator.hasCurrent, isFalse); + }); + }); + + group('when iterating beyond the end of the list', () { + test('should return false', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + hasCurrentIterator.moveNext(); + hasCurrentIterator.moveNext(); + hasCurrentIterator.moveNext(); + expect(hasCurrentIterator.hasCurrent, isFalse); + }); + }); + }); + + group('When testing current', () { + test('should return current item', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + hasCurrentIterator.moveNext(); + expect( + hasCurrentIterator.current, allOf(isNotEmpty, equals(mockFirstItem))); + }); + + test('should return last item item', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + hasCurrentIterator.moveNext(); + hasCurrentIterator.moveNext(); + expect( + hasCurrentIterator.current, allOf(isNotEmpty, equals(mockLastItem))); + }); + }); + + group('When testing moveNext', () { + test('should return true', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + expect(hasCurrentIterator.moveNext(), isTrue); + }); + + test('should return false', () { + final hasCurrentIterator = HasCurrentIterator(mockItems.iterator); + hasCurrentIterator.moveNext(); + hasCurrentIterator.moveNext(); + expect(hasCurrentIterator.moveNext(), isFalse); + }); + }); +} diff --git a/common/http_server/test/http_body_test.dart b/common/http_server/test/http_body_test.dart new file mode 100644 index 0000000..fa8f7ac --- /dev/null +++ b/common/http_server/test/http_body_test.dart @@ -0,0 +1,319 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. 
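The tests below drive `HttpBodyHandler` from both directions. As a rough usage sketch of the API under test (the port and the JSON payload are made up for illustration), the server-side transform and the client-side `processResponse` helper fit together like this:

```dart
import 'dart:io';

import 'package:platform_http_server/http_server.dart';

Future<void> main() async {
  // Hypothetical local server; any free port works.
  var server = await HttpServer.bind(InternetAddress.loopbackIPv4, 8080);
  server.transform(HttpBodyHandler()).listen((body) {
    // body.type is 'text', 'json', 'form' or 'binary', depending on the
    // request's Content-Type header.
    body.request.response
      ..headers.contentType = ContentType.text
      ..write('got a ${body.type} body')
      ..close();
  });

  var client = HttpClient();
  var request = await client.postUrl(Uri.parse('http://localhost:8080/'));
  request.headers.contentType = ContentType.json;
  request.write('{"greeting": "hello"}');
  var response = await request.close();
  var body = await HttpBodyHandler.processResponse(response);
  print(body.body); // The plain-text reply written by the server above.
  client.close();
  await server.close();
}
```

Note that by the time the handler delivers an `HttpRequestBody`, the underlying `HttpRequest` is already drained; the application reads the parsed `body` rather than the byte stream.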
+ +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; +import 'dart:typed_data'; + +import 'package:platform_http_server/http_server.dart'; +import 'package:test/test.dart'; + +import 'http_fakes.dart'; + +void _testHttpClientResponseBody() { + void check( + String mimeType, List content, dynamic expectedBody, String type, + [bool shouldFail = false]) async { + var server = await HttpServer.bind('localhost', 0); + server.listen((request) { + request.listen((_) {}, onDone: () { + request.response.headers.contentType = ContentType.parse(mimeType); + request.response.add(content); + request.response.close(); + }); + }); + + var client = HttpClient(); + try { + var request = await client.get('localhost', server.port, '/'); + var response = await request.close(); + var body = await HttpBodyHandler.processResponse(response); + expect(shouldFail, isFalse); + expect(body.type, equals(type)); + expect(body.response, isNotNull); + switch (type) { + case 'text': + case 'json': + expect(body.body, equals(expectedBody)); + break; + + default: + fail('bad body type'); + } + } catch (_) { + if (!shouldFail) rethrow; + } finally { + client.close(); + await server.close(); + } + } + + check('text/plain', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=utf-8', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=iso-8859-1', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=us-ascii', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=utf-8', [42], '*', 'text'); + check('text/plain; charset=us-ascii', [142], null, 'text', true); + check('text/plain; charset=utf-8', [142], null, 'text', true); + + check('application/json', '{"val": 5}'.codeUnits, {'val': 5}, 'json'); + check('application/json', '{ bad json }'.codeUnits, null, 'json', true); +} + +void _testHttpServerRequestBody() { + void check( + String? 
mimeType, List content, dynamic expectedBody, String type, + {bool shouldFail = false, Encoding defaultEncoding = utf8}) async { + var server = await HttpServer.bind('localhost', 0); + server.transform(HttpBodyHandler(defaultEncoding: defaultEncoding)).listen( + (body) { + if (shouldFail) return; + expect(shouldFail, isFalse); + expect(body.type, equals(type)); + switch (type) { + case 'text': + expect( + body.request.headers.contentType!.mimeType, equals('text/plain')); + expect(body.body, equals(expectedBody)); + break; + + case 'json': + expect(body.request.headers.contentType!.mimeType, + equals('application/json')); + expect(body.body, equals(expectedBody)); + break; + + case 'binary': + expect(body.request.headers.contentType, isNull); + expect(body.body, equals(expectedBody)); + break; + + case 'form': + var mimeType = body.request.headers.contentType!.mimeType; + expect( + mimeType, + anyOf(equals('multipart/form-data'), + equals('application/x-www-form-urlencoded'))); + expect(body.body.keys.toSet(), equals(expectedBody.keys.toSet())); + for (var key in expectedBody.keys) { + var found = body.body[key]; + var expected = expectedBody[key]; + if (found is HttpBodyFileUpload) { + expect(found.contentType.toString(), + equals(expected['contentType'])); + expect(found.filename, equals(expected['filename'])); + expect(found.content, equals(expected['content'])); + } else { + expect(found, equals(expected)); + } + } + break; + + default: + throw StateError('bad body type'); + } + body.request.response.close(); + }, onError: (Object error) { + // ignore: only_throw_errors + if (!shouldFail) throw error; + }); + + var client = HttpClient(); + try { + var request = await client.post('localhost', server.port, '/'); + if (mimeType != null) { + request.headers.contentType = ContentType.parse(mimeType); + } + request.add(content); + var response = await request.close(); + if (shouldFail) { + expect(response.statusCode, equals(HttpStatus.badRequest)); + } + return response.drain(); + } catch (_) { + if (!shouldFail) rethrow; + } finally { + client.close(); + await server.close(); + } + } + + check('text/plain', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=utf-8', 'body'.codeUnits, 'body', 'text'); + check('text/plain; charset=utf-8', [42], '*', 'text'); + check('text/plain; charset=us-ascii', [142], null, 'text', shouldFail: true); + check('text/plain; charset=utf-8', [142], null, 'text', shouldFail: true); + + check('application/json', '{"val": 5}'.codeUnits, {'val': 5}, 'json'); + check('application/json', '{ bad json }'.codeUnits, null, 'json', + shouldFail: true); + + check(null, 'body'.codeUnits, 'body'.codeUnits, 'binary'); + + check( + 'multipart/form-data; boundary=AaB03x', + ''' +--AaB03x\r +Content-Disposition: form-data; name="name"\r +\r +Larry\r +--AaB03x--\r\n''' + .codeUnits, + {'name': 'Larry'}, + 'form'); + + check( + 'multipart/form-data; boundary=AaB03x', + ''' +--AaB03x\r +Content-Disposition: form-data; name="files"; filename="myfile"\r +Content-Type: application/octet-stream\r +\r +File content\r +--AaB03x--\r\n''' + .codeUnits, + { + 'files': { + 'filename': 'myfile', + 'contentType': 'application/octet-stream', + 'content': 'File content'.codeUnits + } + }, + 'form'); + + check( + 'multipart/form-data; boundary=AaB03x', + ''' +--AaB03x\r +Content-Disposition: form-data; name="files"; filename="myfile"\r +Content-Type: application/octet-stream\r +\r +File content\r +--AaB03x\r +Content-Disposition: form-data; name="files"; filename="myfile"\r 
+Content-Type: text/plain\r +\r +File content\r +--AaB03x--\r\n''' + .codeUnits, + { + 'files': { + 'filename': 'myfile', + 'contentType': 'text/plain', + 'content': 'File content' + } + }, + 'form'); + + check( + 'multipart/form-data; boundary=AaB03x', + ''' +--AaB03x\r +Content-Disposition: form-data; name="files"; filename="myfile"\r +Content-Type: application/json\r +\r +File content\r +--AaB03x--\r\n''' + .codeUnits, + { + 'files': { + 'filename': 'myfile', + 'contentType': 'application/json', + 'content': 'File content' + } + }, + 'form'); + + check( + 'application/x-www-form-urlencoded', + '%E5%B9%B3%3D%E4%BB%AE%E5%90%8D=%E5%B9%B3%E4%BB%AE%E5%90%8D&b' + '=%E5%B9%B3%E4%BB%AE%E5%90%8D' + .codeUnits, + {'平=仮名': '平仮名', 'b': '平仮名'}, + 'form'); + + check('application/x-www-form-urlencoded', 'a=%F8+%26%23548%3B'.codeUnits, + null, 'form', + shouldFail: true); + + check('application/x-www-form-urlencoded', 'a=%C0%A0'.codeUnits, null, 'form', + shouldFail: true); + + check('application/x-www-form-urlencoded', 'a=x%A0x'.codeUnits, null, 'form', + shouldFail: true); + + check('application/x-www-form-urlencoded', 'a=x%C0x'.codeUnits, null, 'form', + shouldFail: true); + + check('application/x-www-form-urlencoded', 'a=%C3%B8+%C8%A4'.codeUnits, + {'a': 'ø Ȥ'}, 'form'); + + check('application/x-www-form-urlencoded', 'a=%F8+%26%23548%3B'.codeUnits, + {'a': 'ø Ȥ'}, 'form', + defaultEncoding: latin1); + + check('application/x-www-form-urlencoded', 'name=%26'.codeUnits, + {'name': '&'}, 'form', + defaultEncoding: latin1); + + check('application/x-www-form-urlencoded', 'name=%F8%26'.codeUnits, + {'name': 'ø&'}, 'form', + defaultEncoding: latin1); + + check('application/x-www-form-urlencoded', 'name=%26%3B'.codeUnits, + {'name': '&;'}, 'form', + defaultEncoding: latin1); + + check( + 'application/x-www-form-urlencoded', + 'name=%26%23548%3B%26%23548%3B'.codeUnits, + {'name': 'ȤȤ'}, + 'form', + defaultEncoding: latin1); + + check('application/x-www-form-urlencoded', 'name=%26'.codeUnits, + {'name': '&'}, 'form'); + + check('application/x-www-form-urlencoded', 'name=%C3%B8%26'.codeUnits, + {'name': 'ø&'}, 'form'); + + check('application/x-www-form-urlencoded', 'name=%26%3B'.codeUnits, + {'name': '&;'}, 'form'); + + check('application/x-www-form-urlencoded', + 'name=%C8%A4%26%23548%3B'.codeUnits, {'name': 'ȤȤ'}, 'form'); + + check('application/x-www-form-urlencoded', 'name=%C8%A4%C8%A4'.codeUnits, + {'name': 'ȤȤ'}, 'form'); +} + +void main() { + test('client response body', _testHttpClientResponseBody); + test('server request body', _testHttpServerRequestBody); + + test('Does not close stream while requests are pending', () async { + var data = StreamController(); + var requests = Stream.fromIterable( + [FakeHttpRequest(Uri(), data: data.stream)]); + var isDone = false; + requests + .transform(HttpBodyHandler()) + .listen((_) {}, onDone: () => isDone = true); + await pumpEventQueue(); + expect(isDone, isFalse); + await data.close(); + expect(isDone, isTrue); + }); + + test('Closes stream while no requests are pending', () async { + var requests = Stream.empty(); + var isDone = false; + requests + .transform(HttpBodyHandler()) + .listen((_) {}, onDone: () => isDone = true); + await pumpEventQueue(); + expect(isDone, isTrue); + }); +} diff --git a/common/http_server/test/http_fakes.dart b/common/http_server/test/http_fakes.dart new file mode 100644 index 0000000..b1aee8b --- /dev/null +++ b/common/http_server/test/http_fakes.dart @@ -0,0 +1,240 @@ +import 'dart:async'; +import 'dart:collection'; +import 
'dart:convert'; +import 'dart:io'; +import 'dart:typed_data'; + +class FakeHttpHeaders implements HttpHeaders { + final Map> _headers = HashMap>(); + + @override + List? operator [](key) => _headers[key]; + + @override + int get contentLength => + int.parse(_headers[HttpHeaders.contentLengthHeader]![0]); + + @override + DateTime? get ifModifiedSince { + var values = _headers[HttpHeaders.ifModifiedSinceHeader]; + if (values != null) { + try { + return HttpDate.parse(values[0]); + } on Exception { + return null; + } + } + return null; + } + + @override + set ifModifiedSince(DateTime? ifModifiedSince) { + ArgumentError.checkNotNull(ifModifiedSince); + // Format "ifModifiedSince" header with date in Greenwich Mean Time (GMT). + var formatted = HttpDate.format(ifModifiedSince!.toUtc()); + _set(HttpHeaders.ifModifiedSinceHeader, formatted); + } + + @override + ContentType? contentType; + + @override + void set(String name, Object value, {bool preserveHeaderCase = false}) { + if (preserveHeaderCase) { + throw ArgumentError('preserveHeaderCase not supported'); + } + name = name.toLowerCase(); + _headers.remove(name); + _addAll(name, value); + } + + @override + String? value(String name) { + name = name.toLowerCase(); + var values = _headers[name]; + if (values == null) return null; + if (values.length > 1) { + throw HttpException('More than one value for header $name'); + } + return values[0]; + } + + @override + String toString() => '$runtimeType : $_headers'; + + // [name] must be a lower-case version of the name. + void _add(String name, Object value) { + if (name == HttpHeaders.ifModifiedSinceHeader) { + if (value is DateTime) { + ifModifiedSince = value; + } else if (value is String) { + _set(HttpHeaders.ifModifiedSinceHeader, value); + } else { + throw HttpException('Unexpected type for header named $name'); + } + } else { + _addValue(name, value); + } + } + + void _addAll(String name, Object value) { + if (value is List) { + for (var i = 0; i < value.length; i++) { + _add(name, value[i] as Object); + } + } else { + _add(name, value); + } + } + + void _addValue(String name, Object value) { + var values = _headers[name]; + if (values == null) { + values = []; + _headers[name] = values; + } + if (value is DateTime) { + values.add(HttpDate.format(value)); + } else { + values.add(value.toString()); + } + } + + void _set(String name, String value) { + assert(name == name.toLowerCase()); + var values = []; + _headers[name] = values; + values.add(value); + } + + /* + * Implemented to remove editor warnings + */ + @override + dynamic noSuchMethod(Invocation invocation) { + print([ + invocation.memberName, + invocation.isGetter, + invocation.isSetter, + invocation.isMethod, + invocation.isAccessor + ]); + return super.noSuchMethod(invocation); + } +} + +class FakeHttpRequest extends StreamView implements HttpRequest { + @override + final Uri uri; + @override + final FakeHttpResponse response = FakeHttpResponse(); + @override + final HttpHeaders headers = FakeHttpHeaders(); + @override + final String method = 'GET'; + final bool followRedirects; + + FakeHttpRequest(this.uri, + {this.followRedirects = true, + DateTime? 
ifModifiedSince, + required Stream data}) + : super(data) { + if (ifModifiedSince != null) { + headers.ifModifiedSince = ifModifiedSince; + } + } + + /* + * Implemented to remove editor warnings + */ + @override + dynamic noSuchMethod(Invocation invocation) => super.noSuchMethod(invocation); +} + +class FakeHttpResponse implements HttpResponse { + @override + final HttpHeaders headers = FakeHttpHeaders(); + final Completer _completer = Completer(); + final List _buffer = []; + String? _reasonPhrase; + late final Future _doneFuture; + + FakeHttpResponse() { + _doneFuture = _completer.future.whenComplete(() { + assert(!_isDone); + _isDone = true; + }); + } + + bool _isDone = false; + + @override + int statusCode = HttpStatus.ok; + + @override + String get reasonPhrase => _findReasonPhrase(statusCode)!; + + @override + set reasonPhrase(String value) { + _reasonPhrase = value; + } + + @override + Future get done => _doneFuture; + + @override + Future close() { + _completer.complete(); + return _doneFuture; + } + + @override + void add(List data) { + _buffer.addAll(data); + } + + @override + void addError(error, [StackTrace? stackTrace]) { + // doesn't seem to be hit...hmm... + } + + @override + Future redirect(Uri location, {int status = HttpStatus.movedTemporarily}) { + statusCode = status; + headers.set(HttpHeaders.locationHeader, location.toString()); + return close(); + } + + @override + void write(Object? obj) { + var str = obj.toString(); + add(utf8.encode(str)); + } + + /* + * Implemented to remove editor warnings + */ + @override + dynamic noSuchMethod(Invocation invocation) => super.noSuchMethod(invocation); + + String get fakeContent => utf8.decode(_buffer); + + List get fakeContentBinary => _buffer; + + bool get fakeDone => _isDone; + + // Copied from SDK http_impl.dart @ 845 on 2014-01-05 + // TODO: file an SDK bug to expose this on HttpStatus in some way + String? _findReasonPhrase(int statusCode) { + if (_reasonPhrase != null) { + return _reasonPhrase; + } + + switch (statusCode) { + case HttpStatus.notFound: + return 'Not Found'; + default: + return 'Status $statusCode'; + } + } +} diff --git a/common/http_server/test/http_multipart_test.dart b/common/http_server/test/http_multipart_test.dart new file mode 100644 index 0000000..0243434 --- /dev/null +++ b/common/http_server/test/http_multipart_test.dart @@ -0,0 +1,276 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:convert'; +import 'dart:io'; + +import 'package:platform_http_server/http_server.dart'; +import 'package:mime/mime.dart'; +import 'package:test/test.dart'; + +// Representation of a form field from a multipart/form-data form POST body. +class FormField { + // Name of the form field specified in Content-Disposition. + final String? name; + // Value of the form field. This is either a String or a List depending + // on the Content-Type. + final dynamic value; + // Content-Type of the form field. + final String? contentType; + // Filename if specified in Content-Disposition. + final String? 
filename; + + FormField(this.name, this.value, {this.contentType, this.filename}); + + @override + bool operator ==(other) => + other is FormField && + _valuesEqual(value, other.value) && + name == other.name && + contentType == other.contentType && + filename == other.filename; + + @override + int get hashCode => name.hashCode; + + @override + String toString() { + return "FormField('$name', '$value', '$contentType', '$filename')"; + } + + static bool _valuesEqual(a, b) { + if (a is String && b is String) { + return a == b; + } else if (a is List && b is List) { + if (a.length != b.length) { + return false; + } + for (var i = 0; i < a.length; i++) { + if (a[i] != b[i]) { + return false; + } + } + return true; + } + return false; + } +} + +Future _postDataTest(List message, String contentType, String boundary, + List expectedFields, + {Encoding defaultEncoding = latin1}) async { + var addr = (await InternetAddress.lookup('localhost'))[0]; + + var server = await HttpServer.bind(addr, 0); + + server.listen((request) async { + var boundary = request.headers.contentType!.parameters['boundary']!; + var fields = await MimeMultipartTransformer(boundary) + .bind(request) + .map((part) => + HttpMultipartFormData.parse(part, defaultEncoding: defaultEncoding)) + .asyncMap((multipart) async { + dynamic data; + if (multipart.isText) { + data = await multipart.join(); + } else { + data = await multipart + .fold>([], (b, s) => b..addAll(s as List)); + } + String? contentType; + if (multipart.contentType != null) { + contentType = multipart.contentType!.mimeType; + } + return FormField(multipart.contentDisposition.parameters['name'], data, + contentType: contentType, + filename: multipart.contentDisposition.parameters['filename']); + }).toList(); + expect(fields, equals(expectedFields)); + await request.response.close(); + await server.close(); + }); + + var client = HttpClient(); + + var request = await client.post('localhost', server.port, '/'); + + request.headers + .set('content-type', 'multipart/form-data; boundary=$boundary'); + request.add(message); + + await request.close(); + client.close(); + await server.close(force: true); +} + +void main() { + test('empty', () async { + var message0 = ''' +------WebKitFormBoundaryU3FBruSkJKG0Yor1--\r\n'''; + + await _postDataTest(message0.codeUnits, 'multipart/form-data', + '----WebKitFormBoundaryU3FBruSkJKG0Yor1', []); + }); + + test('test 1', () async { + var message = ''' +\r\n--AaB03x\r +Content-Disposition: form-data; name="submit-name"\r +\r +Larry\r +--AaB03x\r +Content-Disposition: form-data; name="files"; filename="file1.txt"\r +Content-Type: text/plain\r +\r +Content of file\r +--AaB03x--\r\n'''; + + await _postDataTest(message.codeUnits, 'multipart/form-data', 'AaB03x', [ + FormField('submit-name', 'Larry'), + FormField('files', 'Content of file', + contentType: 'text/plain', filename: 'file1.txt') + ]); + }); + + test('With content transfer encoding', () async { + var message = ''' +\r\n--AaB03x\r +Content-Disposition: form-data; name="submit-name"\r +Content-Transfer-Encoding: 8bit\r +\r +Larry\r +--AaB03x--\r\n'''; + + await _postDataTest(message.codeUnits, 'multipart/form-data', 'AaB03x', + [FormField('submit-name', 'Larry')]); + }); + + test('Windows/IE style file upload', () async { + var message = ''' +\r\n--AaB03x\r +Content-Disposition: form-data; name="files"; filename="C:\\file1\\".txt"\r +Content-Type: text/plain\r +\r +Content of file\r +--AaB03x--\r\n'''; + + await _postDataTest(message.codeUnits, 'multipart/form-data', 'AaB03x', [ + 
FormField('files', 'Content of file', + contentType: 'text/plain', filename: 'C:\\file1".txt') + ]); + }); + + test('Similar test using Chrome posting.', () async { + var message2 = [ + // Dartfmt, please do not touch. + 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, + 111, 117, 110, 100, 97, 114, 121, 81, 83, 113, 108, 56, 107, 68, 65, 76, + 77, 55, 116, 65, 107, 67, 49, 13, 10, 67, 111, 110, 116, 101, 110, 116, + 45, 68, 105, 115, 112, 111, 115, 105, 116, 105, 111, 110, 58, 32, 102, + 111, 114, 109, 45, 100, 97, 116, 97, 59, 32, 110, 97, 109, 101, 61, 34, + 115, 117, 98, 109, 105, 116, 45, 110, 97, 109, 101, 34, 13, 10, 13, 10, + 84, 101, 115, 116, 13, 10, 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, + 116, 70, 111, 114, 109, 66, 111, 117, 110, 100, 97, 114, 121, 81, 83, 113, + 108, 56, 107, 68, 65, 76, 77, 55, 116, 65, 107, 67, 49, 13, 10, 67, 111, + 110, 116, 101, 110, 116, 45, 68, 105, 115, 112, 111, 115, 105, 116, 105, + 111, 110, 58, 32, 102, 111, 114, 109, 45, 100, 97, 116, 97, 59, 32, 110, + 97, 109, 101, 61, 34, 102, 105, 108, 101, 115, 34, 59, 32, 102, 105, 108, + 101, 110, 97, 109, 101, 61, 34, 86, 69, 82, 83, 73, 79, 78, 34, 13, 10, + 67, 111, 110, 116, 101, 110, 116, 45, 84, 121, 112, 101, 58, 32, 97, 112, + 112, 108, 105, 99, 97, 116, 105, 111, 110, 47, 111, 99, 116, 101, 116, 45, + 115, 116, 114, 101, 97, 109, 13, 10, 13, 10, 123, 32, 10, 32, 32, 34, 114, + 101, 118, 105, 115, 105, 111, 110, 34, 58, 32, 34, 50, 49, 56, 54, 48, 34, + 44, 10, 32, 32, 34, 118, 101, 114, 115, 105, 111, 110, 34, 32, 58, 32, 34, + 48, 46, 49, 46, 50, 46, 48, 95, 114, 50, 49, 56, 54, 48, 34, 44, 10, 32, + 32, 34, 100, 97, 116, 101, 34, 32, 32, 32, 32, 58, 32, 34, 50, 48, 49, 51, + 48, 52, 50, 51, 48, 48, 48, 52, 34, 10, 125, 13, 10, 45, 45, 45, 45, 45, + 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, 111, 117, 110, 100, + 97, 114, 121, 81, 83, 113, 108, 56, 107, 68, 65, 76, 77, 55, 116, 65, 107, + 67, 49, 45, 45, 13, 10 + ]; + + var data = [ + // Dartfmt, please do not touch. + 123, 32, 10, 32, 32, 34, 114, 101, 118, 105, 115, 105, 111, 110, 34, 58, + 32, 34, 50, 49, 56, 54, 48, 34, 44, 10, 32, 32, 34, 118, 101, 114, 115, + 105, 111, 110, 34, 32, 58, 32, 34, 48, 46, 49, 46, 50, 46, 48, 95, 114, + 50, 49, 56, 54, 48, 34, 44, 10, 32, 32, 34, 100, 97, 116, 101, 34, 32, 32, + 32, 32, 58, 32, 34, 50, 48, 49, 51, 48, 52, 50, 51, 48, 48, 48, 52, 34, + 10, 125 + ]; + + await _postDataTest(message2, 'multipart/form-data', + '----WebKitFormBoundaryQSql8kDALM7tAkC1', [ + FormField('submit-name', 'Test'), + FormField('files', data, + contentType: 'application/octet-stream', filename: 'VERSION') + ]); + }); + + test('HTML entity encoding in values in form fields', () async { + // In Chrome, Safari and Firefox HTML entity encoding might be used for + // values in form fields. The HTML entity encoding for ひらがな is + // ひらがな + var message3 = [ + // Dartfmt, please do not touch. 
+ 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, + 111, 117, 110, 100, 97, 114, 121, 118, 65, 86, 122, 117, 103, 75, 77, 116, + 90, 98, 121, 87, 111, 66, 71, 13, 10, 67, 111, 110, 116, 101, 110, 116, + 45, 68, 105, 115, 112, 111, 115, 105, 116, 105, 111, 110, 58, 32, 102, + 111, 114, 109, 45, 100, 97, 116, 97, 59, 32, 110, 97, 109, 101, 61, 34, + 110, 97, 109, 101, 34, 13, 10, 13, 10, 38, 35, 49, 50, 52, 48, 50, 59, 38, + 35, 49, 50, 52, 50, 53, 59, 38, 35, 49, 50, 51, 54, 52, 59, 38, 35, 49, + 50, 51, 57, 52, 59, 13, 10, 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, + 116, 70, 111, 114, 109, 66, 111, 117, 110, 100, 97, 114, 121, 118, 65, 86, + 122, 117, 103, 75, 77, 116, 90, 98, 121, 87, 111, 66, 71, 45, 45, 13, 10 + ]; + + await _postDataTest( + message3, + 'multipart/form-data', + '----WebKitFormBoundaryvAVzugKMtZbyWoBG', + [FormField('name', 'ひらがな')], + defaultEncoding: utf8); + }); + + test('UTF', () async { + // The UTF-8 encoding of ひらがな is + // [227, 129, 178, 227, 130, 137, 227, 129, 140, 227, 129, 170]. + var message4 = [ + // Dartfmt, please do not touch. + 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, + 111, 117, 110, 100, 97, 114, 121, 71, 88, 116, 66, 114, 99, 106, 120, 104, + 101, 75, 101, 78, 54, 105, 48, 13, 10, 67, 111, 110, 116, 101, 110, 116, + 45, 68, 105, 115, 112, 111, 115, 105, 116, 105, 111, 110, 58, 32, 102, + 111, 114, 109, 45, 100, 97, 116, 97, 59, 32, 110, 97, 109, 101, 61, 34, + 116, 101, 115, 116, 34, 13, 10, 13, 10, 227, 129, 178, 227, 130, 137, 227, + 129, 140, 227, 129, 170, 13, 10, 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, + 105, 116, 70, 111, 114, 109, 66, 111, 117, 110, 100, 97, 114, 121, 71, 88, + 116, 66, 114, 99, 106, 120, 104, 101, 75, 101, 78, 54, 105, 48, 45, 45, + 13, 10 + ]; + + await _postDataTest(message4, 'multipart/form-data', + '----WebKitFormBoundaryGXtBrcjxheKeN6i0', [FormField('test', 'ひらがな')], + defaultEncoding: utf8); + }); + + test('WebKit', () async { + var message5 = [ + // Dartfmt, please do not touch. + 45, 45, 45, 45, 45, 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, + 111, 117, 110, 100, 97, 114, 121, 102, 101, 48, 69, 122, 86, 49, 97, 78, + 121, 115, 68, 49, 98, 80, 104, 13, 10, 67, 111, 110, 116, 101, 110, 116, + 45, 68, 105, 115, 112, 111, 115, 105, 116, 105, 111, 110, 58, 32, 102, + 111, 114, 109, 45, 100, 97, 116, 97, 59, 32, 110, 97, 109, 101, 61, 34, + 110, 97, 109, 101, 34, 13, 10, 13, 10, 248, 118, 13, 10, 45, 45, 45, 45, + 45, 45, 87, 101, 98, 75, 105, 116, 70, 111, 114, 109, 66, 111, 117, 110, + 100, 97, 114, 121, 102, 101, 48, 69, 122, 86, 49, 97, 78, 121, 115, 68, + 49, 98, 80, 104, 45, 45, 13, 10 + ]; + + await _postDataTest(message5, 'multipart/form-data', + '----WebKitFormBoundaryfe0EzV1aNysD1bPh', [FormField('name', 'øv')]); + }); +} diff --git a/common/http_server/test/pkcert/README b/common/http_server/test/pkcert/README new file mode 100644 index 0000000..fe764a9 --- /dev/null +++ b/common/http_server/test/pkcert/README @@ -0,0 +1,16 @@ +This is a certificate database used by Dart for testing purposes. + +It is created as a certificate database by NSS (Network Security Services), +a library from Mozilla, using the certutil tool. It uses a cert9.db file, +rather than a cert8.db file, so the database directory must be specified with +"sql:" in front of the directory path, or the environment variable +NSS_DEFAULT_DB_TYPE must be set to "sql". + +The password for the key database is "dartdart". 
+ +The database contains a root certificate from Equifax, used to verify the +client https connection to www.google.dk. It contains a self-signed +certificate for a local certificate authority myauthority_cert, and a +server certificate for localhost called localhost_cert, signed by +myauthority_cert. It contains the key for localhost_cert, but +not the key for myauthority_cert. diff --git a/common/http_server/test/pkcert/cert9.db b/common/http_server/test/pkcert/cert9.db new file mode 100644 index 0000000..497fca6 Binary files /dev/null and b/common/http_server/test/pkcert/cert9.db differ diff --git a/common/http_server/test/pkcert/key4.db b/common/http_server/test/pkcert/key4.db new file mode 100644 index 0000000..fc06432 Binary files /dev/null and b/common/http_server/test/pkcert/key4.db differ diff --git a/common/http_server/test/utils.dart b/common/http_server/test/utils.dart new file mode 100644 index 0000000..d63a6f2 --- /dev/null +++ b/common/http_server/test/utils.dart @@ -0,0 +1,317 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; +import 'dart:mirrors'; +import 'dart:typed_data'; + +import 'package:platform_http_server/http_server.dart'; +import 'package:test/test.dart'; +import 'package:test_api/src/backend/invoker.dart'; + +import 'http_fakes.dart'; + +Object get currentTestCase => Invoker.current!.liveTest; + +late SecurityContext serverContext; +SecurityContext? clientContext; + +/// Used to flag a given test case as being a fake or not. +final _isFakeTestExpando = Expando('isFakeTest'); + +void testVirtualDir(String name, Future Function(Directory) func) { + _testVirtualDir(name, false, func); + _testVirtualDir(name, true, func); +} + +void _testVirtualDir( + String name, bool useFakes, Future Function(Directory) func) { + if (useFakes) { + name = '$name, with fakes'; + } + + test(name, () async { + // see subsequent access to this expando below + _isFakeTestExpando[currentTestCase] = useFakes; + + var dir = Directory.systemTemp.createTempSync('http_server_virtual_'); + + try { + await func(dir); + } finally { + await dir.delete(recursive: true); + } + }); +} + +Future statusCodeForVirtDir(VirtualDirectory virtualDir, String path, + {String? host, + bool secure = false, + DateTime? ifModifiedSince, + bool rawPath = false, + bool followRedirects = true, + int? from, + int? to}) async { + // if this is a fake test, then run the fake code path + if (_isFakeTestExpando[currentTestCase]!) { + var uri = _localhostUri(0, path, secure: secure, rawPath: rawPath); + + var request = FakeHttpRequest(uri, + followRedirects: followRedirects, + ifModifiedSince: ifModifiedSince, + data: StreamController().stream); + _addRangeHeader(request, from, to); + + var response = await _withFakeRequest(virtualDir, request); + return response.statusCode; + } + + assert(_isFakeTestExpando[currentTestCase] == false); + + return _withServer(virtualDir, (port) { + return fetchStatusCode(port, path, + host: host, + secure: secure, + ifModifiedSince: ifModifiedSince, + rawPath: rawPath, + followRedirects: followRedirects, + from: from, + to: to); + }); +} + +Future fetchStatusCode(int port, String path, + {String? host, + bool secure = false, + DateTime? ifModifiedSince, + bool rawPath = false, + bool followRedirects = true, + int? from, + int? 
to}) async {
+  var uri = _localhostUri(port, path, secure: secure, rawPath: rawPath);
+
+  HttpClient client;
+  if (secure) {
+    client = HttpClient(context: clientContext);
+  } else {
+    client = HttpClient();
+  }
+
+  try {
+    var request = await client.getUrl(uri);
+
+    if (!followRedirects) request.followRedirects = false;
+    if (host != null) request.headers.host = host;
+    if (ifModifiedSince != null) {
+      request.headers.ifModifiedSince = ifModifiedSince;
+    }
+    _addRangeHeader(request, from, to);
+    var response = await request.close();
+    await response.drain();
+    return response.statusCode;
+  } finally {
+    client.close();
+  }
+}
+
+Future fetchHEaders(VirtualDirectory virDir, String path,
+    {int? from, int? to}) async {
+  // if this is a fake test, then run the fake code path
+  if (_isFakeTestExpando[currentTestCase]!) {
+    var uri = _localhostUri(0, path);
+
+    var request =
+        FakeHttpRequest(uri, data: StreamController().stream);
+    _addRangeHeader(request, from, to);
+
+    var response = await _withFakeRequest(virDir, request);
+    return response.headers;
+  }
+
+  assert(_isFakeTestExpando[currentTestCase] == false);
+
+  return _withServer(virDir, (port) => _headers(port, path, from, to));
+}
+
+Future fetchAsString(VirtualDirectory virtualDir, String path) async {
+  // if this is a fake test, then run the fake code path
+  if (_isFakeTestExpando[currentTestCase]!) {
+    var uri = _localhostUri(0, path);
+
+    var request =
+        FakeHttpRequest(uri, data: StreamController().stream);
+
+    var response = await _withFakeRequest(virtualDir, request);
+    return response.fakeContent;
+  }
+
+  assert(_isFakeTestExpando[currentTestCase] == false);
+
+  return _withServer(virtualDir, (int port) => _fetchAsString(port, path));
+}
+
+Future<List<int>> fetchAsBytes(VirtualDirectory virtualDir, String path,
+    {int? from, int? to}) async {
+  // if this is a fake test, then run the fake code path
+  if (_isFakeTestExpando[currentTestCase]!) {
+    var uri = _localhostUri(0, path);
+
+    var request =
+        FakeHttpRequest(uri, data: StreamController().stream);
+    _addRangeHeader(request, from, to);
+
+    var response = await _withFakeRequest(virtualDir, request);
+    return response.fakeContentBinary;
+  }
+
+  assert(_isFakeTestExpando[currentTestCase] == false);
+
+  return _withServer(
+      virtualDir, (int port) => _fetchAsBytes(port, path, from, to));
+}
+
+Future fetchContentAndResponse(VirtualDirectory virtualDir, String path,
+    {int? from, int? to}) async {
+  // if this is a fake test, then run the fake code path
+  if (_isFakeTestExpando[currentTestCase]!) {
+    var uri = _localhostUri(0, path);
+
+    var request =
+        FakeHttpRequest(uri, data: StreamController().stream);
+    _addRangeHeader(request, from, to);
+
+    var response = await _withFakeRequest(virtualDir, request);
+    return [response.fakeContentBinary, response];
+  }
+
+  assert(_isFakeTestExpando[currentTestCase] == false);
+
+  return _withServer(
+      virtualDir, (int port) => _fetchContentAndResponse(port, path, from, to));
+}
+
+Future _withFakeRequest(
+    VirtualDirectory virDir, FakeHttpRequest request) async {
+  var value = await virDir.serveRequest(request);
+
+  expect(value, isNull);
+  expect(request.response.fakeDone, isTrue);
+
+  var response = request.response;
+
+  if (response.statusCode == HttpStatus.movedPermanently ||
+      response.statusCode == HttpStatus.movedTemporarily) {
+    if (request.followRedirects == true) {
+      var uri = Uri.parse(response.headers.value(HttpHeaders.locationHeader)!);
+      var newMock = FakeHttpRequest(uri,
+          followRedirects: true, data: StreamController().stream);
+
+      return _withFakeRequest(virDir, newMock);
+    }
+  }
+  return response;
+}
+
+Future _withServer(
+    VirtualDirectory virDir, Future Function(int port) func) async {
+  var server = await HttpServer.bind('localhost', 0);
+
+  try {
+    virDir.serve(server);
+    return await func(server.port);
+  } finally {
+    await server.close();
+  }
+}
+
+Future _headers(int port, String path, int? from, int? to) async {
+  var client = HttpClient();
+  try {
+    var request = await client.get('localhost', port, path);
+    _addRangeHeader(request, from, to);
+    var response = await request.close();
+    await response.drain();
+    return response.headers;
+  } finally {
+    client.close();
+  }
+}
+
+Future _fetchAsString(int port, String path) async {
+  var client = HttpClient();
+  try {
+    var request = await client.get('localhost', port, path);
+    var response = await request.close();
+    return await utf8.decodeStream(response.cast<List<int>>());
+  } finally {
+    client.close();
+  }
+}
+
+Future<List<int>> _fetchAsBytes(
+    int port, String path, int? from, int? to) async {
+  var client = HttpClient();
+  try {
+    var request = await client.get('localhost', port, path);
+    _addRangeHeader(request, from, to);
+    var response = await request.close();
+    return await response.fold([], (p, e) => p..addAll(e));
+  } finally {
+    client.close();
+  }
+}
+
+Future _fetchContentAndResponse(
+    int port, String path, int? from, int? to) async {
+  var client = HttpClient();
+  try {
+    var request = await client.get('localhost', port, path);
+    _addRangeHeader(request, from, to);
+    var response = await request.close();
+    var bytes = await response.fold<List<int>>([], (p, e) => p..addAll(e));
+    return [bytes, response];
+  } finally {
+    client.close();
+  }
+}
+
+Uri _localhostUri(int port, String path,
+    {bool secure = false, bool rawPath = false}) {
+  if (rawPath) {
+    return Uri(
+        scheme: secure ? 'https' : 'http',
+        host: 'localhost',
+        port: port,
+        path: path);
+  } else {
+    return (secure
+        ? Uri.https('localhost:$port', path)
+        : Uri.http('localhost:$port', path));
+  }
+}
+
+void _addRangeHeader(request, int? from, int? to) {
+  var fromStr = from != null ? '$from' : '';
+  var toStr = to != null ?
'$to' : ''; + if (fromStr.isNotEmpty || toStr.isNotEmpty) { + request.headers.set(HttpHeaders.rangeHeader, 'bytes=$fromStr-$toStr'); + } +} + +void setupSecure() { + var currentFileUri = + (reflect(setupSecure) as ClosureMirror).function.location!.sourceUri; + + String localFile(String path) => currentFileUri.resolve(path).toFilePath(); + + serverContext = SecurityContext() + ..useCertificateChain(localFile('certificates/server_chain.pem')) + ..usePrivateKey(localFile('certificates/server_key.pem'), + password: 'dartdart'); + + clientContext = SecurityContext() + ..setTrustedCertificates(localFile('certificates/trusted_certs.pem')); +} diff --git a/common/http_server/test/virtual_directory_test.dart b/common/http_server/test/virtual_directory_test.dart new file mode 100644 index 0000000..b297895 --- /dev/null +++ b/common/http_server/test/virtual_directory_test.dart @@ -0,0 +1,688 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:io'; + +import 'package:platform_http_server/http_server.dart'; +import 'package:path/path.dart' as pathos; +import 'package:test/test.dart'; + +import 'utils.dart'; + +void _testEncoding(name, expected, [bool create = true]) { + testVirtualDir('encode-$name', (dir) async { + if (create) File('${dir.path}/$name').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await statusCodeForVirtDir(virDir, '/$name'); + expect(result, expected); + }); +} + +void main() { + group('serve-root', () { + testVirtualDir('dir-exists', (dir) async { + var virDir = VirtualDirectory(dir.path); + + var result = await statusCodeForVirtDir(virDir, '/'); + expect(result, HttpStatus.notFound); + }); + + testVirtualDir('dir-not-exists', (dir) async { + var virDir = VirtualDirectory(pathos.join('${dir.path}foo')); + + var result = await statusCodeForVirtDir(virDir, '/'); + expect(result, HttpStatus.notFound); + }); + }); + + group('serve-file', () { + group('top-level', () { + testVirtualDir('file-exists', (dir) async { + File('${dir.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + var result = await statusCodeForVirtDir(virDir, '/file'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('file-not-exists', (dir) async { + var virDir = VirtualDirectory(dir.path); + + var result = await statusCodeForVirtDir(virDir, '/file'); + expect(result, HttpStatus.notFound); + }); + }); + + group('in-dir', () { + testVirtualDir('file-exists', (dir) async { + var dir2 = Directory('${dir.path}/dir')..createSync(); + File('${dir2.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + var result = await statusCodeForVirtDir(virDir, '/dir/file'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('file-not-exists', (dir) async { + Directory('${dir.path}/dir').createSync(); + File('${dir.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + + var result = await statusCodeForVirtDir(virDir, '/dir/file'); + expect(result, HttpStatus.notFound); + }); + }); + }); + + group('serve-dir', () { + group('top-level', () { + testVirtualDir('simple', (dir) async { + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('Index of /')); + }); + + testVirtualDir('files', (dir) async { + 
var virDir = VirtualDirectory(dir.path); + for (var i = 0; i < 10; i++) { + File('${dir.path}/$i').createSync(); + } + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('Index of /')); + }); + + testVirtualDir('dir-href', (dir) async { + var virDir = VirtualDirectory(dir.path); + Directory('${dir.path}/dir').createSync(); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('')); + }); + + testVirtualDir('dirs', (dir) async { + var virDir = VirtualDirectory(dir.path); + for (var i = 0; i < 10; i++) { + Directory('${dir.path}/$i').createSync(); + } + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('Index of /')); + }); + + testVirtualDir('encoded-dir', (dir) async { + var virDir = VirtualDirectory(dir.path); + Directory('${dir.path}/alert(\'hacked!\');').createSync(); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/alert(\'hacked!\');'); + expect(result, contains('/alert('hacked!');/')); + }); + + testVirtualDir('non-ascii-dir', (dir) async { + var virDir = VirtualDirectory(dir.path); + Directory('${dir.path}/æø').createSync(); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('æø')); + }); + + testVirtualDir('content-type', (dir) async { + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var headers = await fetchHEaders(virDir, '/'); + var contentType = headers.contentType.toString(); + expect(contentType, 'text/html; charset=utf-8'); + }); + + if (!Platform.isWindows) { + testVirtualDir('recursive-link', (dir) async { + Link('${dir.path}/recursive').createSync('.'); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await Future.wait([ + fetchAsString(virDir, '/') + .then((s) => s.contains('recursive/')), + fetchAsString(virDir, '/').then((s) => !s.contains('../')), + fetchAsString(virDir, '/') + .then((s) => s.contains('Index of /')), + fetchAsString(virDir, '/recursive') + .then((s) => s.contains('recursive/')), + fetchAsString(virDir, '/recursive') + .then((s) => s.contains('../')), + fetchAsString(virDir, '/recursive') + .then((s) => s.contains('Index of /recursive')) + ]); + expect(result, equals([true, true, true, true, true, true])); + }); + + testVirtualDir('encoded-path', (dir) async { + var virDir = VirtualDirectory(dir.path); + Directory('${dir.path}/javascript:alert(document);"').createSync(); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('javascript%3Aalert(document)%3B%22/')); + }); + + testVirtualDir('encoded-special', (dir) async { + var virDir = VirtualDirectory(dir.path); + Directory('${dir.path}/<>&"').createSync(); + virDir.allowDirectoryListing = true; + + var result = await fetchAsString(virDir, '/'); + expect(result, contains('<>&"/')); + expect(result, contains('href="%3C%3E%26%22/"')); + }); + } + }); + + group('custom', () { + testVirtualDir('simple', (dir) async { + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (dir2, request) { + expect(dir2, isNotNull); + expect(FileSystemEntity.identicalSync(dir.path, dir2.path), isTrue); + request.response.write('My handler ${request.uri.path}'); + request.response.close(); + }; + + var result = await fetchAsString(virDir, 
'/'); + expect(result, 'My handler /'); + }); + + testVirtualDir('index-1', (dir) async { + File('${dir.path}/index.html').writeAsStringSync('index file'); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (dir2, request) { + // Redirect directory-requests to index.html files. + var indexUri = Uri.file(dir2.path).resolve('index.html'); + return virDir.serveFile(File(indexUri.toFilePath()), request); + }; + + var result = await fetchAsString(virDir, '/'); + expect(result, 'index file'); + }); + + testVirtualDir('index-2', (dir) async { + Directory('${dir.path}/dir').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + virDir.directoryHandler = (dir2, request) { + fail('not expected'); + }; + + var result = + await statusCodeForVirtDir(virDir, '/dir', followRedirects: false); + expect(result, 301); + }); + + testVirtualDir('index-3', (dir) async { + File('${dir.path}/dir/index.html') + ..createSync(recursive: true) + ..writeAsStringSync('index file'); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (dir2, request) { + // Redirect directory-requests to index.html files. + var indexUri = Uri.file(dir2.path).resolve('index.html'); + return virDir.serveFile(File(indexUri.toFilePath()), request); + }; + var result = await fetchAsString(virDir, '/dir'); + expect(result, 'index file'); + }); + + testVirtualDir('index-4', (dir) async { + File('${dir.path}/dir/index.html') + ..createSync(recursive: true) + ..writeAsStringSync('index file'); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (dir2, request) { + // Redirect directory-requests to index.html files. 
+ var indexUri = Uri.file(dir2.path).resolve('index.html'); + virDir.serveFile(File(indexUri.toFilePath()), request); + }; + var result = await fetchAsString(virDir, '/dir/'); + expect(result, 'index file'); + }); + }); + + group('path-prefix', () { + testVirtualDir('simple', (dir) async { + var virDir = VirtualDirectory(dir.path, pathPrefix: '/path'); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (d, request) { + expect(FileSystemEntity.identicalSync(dir.path, d.path), isTrue); + request.response.close(); + }; + + var result = await statusCodeForVirtDir(virDir, '/path'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('trailing-slash', (dir) async { + var virDir = VirtualDirectory(dir.path, pathPrefix: '/path/'); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (d, request) { + expect(FileSystemEntity.identicalSync(dir.path, d.path), isTrue); + request.response.close(); + }; + + var result = await statusCodeForVirtDir(virDir, '/path'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('not-matching', (dir) async { + var virDir = VirtualDirectory(dir.path, pathPrefix: '/path/'); + var result = await statusCodeForVirtDir(virDir, '/'); + expect(result, HttpStatus.notFound); + }); + }); + }); + + group('links', () { + if (!Platform.isWindows) { + group('follow-links', () { + testVirtualDir('dir-link', (dir) async { + var dir2 = Directory('${dir.path}/dir2')..createSync(); + Link('${dir.path}/dir3').createSync('dir2'); + File('${dir2.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.followLinks = true; + + var result = await statusCodeForVirtDir(virDir, '/dir3/file'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('root-link', (dir) async { + Link('${dir.path}/dir3').createSync('.'); + File('${dir.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.followLinks = true; + + var result = await statusCodeForVirtDir(virDir, '/dir3/file'); + expect(result, HttpStatus.ok); + }); + + group('bad-links', () { + testVirtualDir('absolute-link', (dir) async { + File('${dir.path}/file').createSync(); + Link('${dir.path}/file2').createSync('${dir.path}/file'); + var virDir = VirtualDirectory(dir.path); + virDir.followLinks = true; + + var result = await statusCodeForVirtDir(virDir, '/file2'); + expect(result, HttpStatus.notFound); + }); + + testVirtualDir('relative-parent-link', (dir) async { + var dir2 = Directory('${dir.path}/dir')..createSync(); + File('${dir.path}/file').createSync(); + Link('${dir2.path}/file').createSync('../file'); + var virDir = VirtualDirectory(dir2.path); + virDir.followLinks = true; + + var result = await statusCodeForVirtDir(virDir, '/dir3/file'); + expect(result, HttpStatus.notFound); + }); + }); + }); + + group('not-follow-links', () { + testVirtualDir('dir-link', (dir) async { + var dir2 = Directory('${dir.path}/dir2')..createSync(); + Link('${dir.path}/dir3').createSync('dir2'); + File('${dir2.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.followLinks = false; + + var result = await statusCodeForVirtDir(virDir, '/dir3/file'); + expect(result, HttpStatus.notFound); + }); + }); + + group('follow-links', () { + group('no-root-jail', () { + testVirtualDir('absolute-link', (dir) async { + File('${dir.path}/file').createSync(); + Link('${dir.path}/file2').createSync('${dir.path}/file'); + var virDir = VirtualDirectory(dir.path); + virDir.followLinks = true; + virDir.jailRoot = false; + + var result = await 
statusCodeForVirtDir(virDir, '/file2'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('relative-parent-link', (dir) async { + var dir2 = Directory('${dir.path}/dir')..createSync(); + File('${dir.path}/file').createSync(); + Link('${dir2.path}/file').createSync('../file'); + var virDir = VirtualDirectory(dir2.path); + virDir.followLinks = true; + virDir.jailRoot = false; + + var result = await statusCodeForVirtDir(virDir, '/file'); + expect(result, HttpStatus.ok); + }); + }); + }); + } + }); + + group('last-modified', () { + group('file', () { + testVirtualDir('file-exists', (dir) async { + File('${dir.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + + var headers = await fetchHEaders(virDir, '/file'); + expect(headers.value(HttpHeaders.lastModifiedHeader), isNotNull); + var lastModified = + HttpDate.parse(headers.value(HttpHeaders.lastModifiedHeader)!); + + var result = await statusCodeForVirtDir(virDir, '/file', + ifModifiedSince: lastModified); + expect(result, HttpStatus.notModified); + }); + + testVirtualDir('file-changes', (dir) async { + File('${dir.path}/file').createSync(); + var virDir = VirtualDirectory(dir.path); + + var headers = await fetchHEaders(virDir, '/file'); + expect(headers.value(HttpHeaders.lastModifiedHeader), isNotNull); + var lastModified = + HttpDate.parse(headers.value(HttpHeaders.lastModifiedHeader)!); + + // Fake file changed by moving date back in time. + lastModified = lastModified.subtract(const Duration(seconds: 10)); + + var result = await statusCodeForVirtDir(virDir, '/file', + ifModifiedSince: lastModified); + expect(result, HttpStatus.ok); + }); + }); + }); + + group('content-type', () { + group('mime-type', () { + testVirtualDir('from-path', (dir) async { + File('${dir.path}/file.jpg').createSync(); + var virDir = VirtualDirectory(dir.path); + + var headers = await fetchHEaders(virDir, '/file.jpg'); + var contentType = headers.contentType.toString(); + expect(contentType, 'image/jpeg'); + }); + + testVirtualDir('from-magic-number', (dir) async { + var file = File('${dir.path}/file.jpg')..createSync(); + file.writeAsBytesSync([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A]); + var virDir = VirtualDirectory(dir.path); + + var headers = await fetchHEaders(virDir, '/file.jpg'); + var contentType = headers.contentType.toString(); + expect(contentType, 'image/png'); + }); + }); + }); + + group('range', () { + var fileContent = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]; + late VirtualDirectory virDir; + + void prepare(Directory dir) { + File('${dir.path}/file').writeAsBytesSync(fileContent); + virDir = VirtualDirectory(dir.path); + } + + testVirtualDir('range', (dir) async { + prepare(dir); + Future check(int from, int to, + [List? expected, String? 
contentRange]) async { + expected ??= fileContent.sublist(from, to + 1); + contentRange ??= 'bytes $from-$to/${fileContent.length}'; + var result = + await fetchContentAndResponse(virDir, '/file', from: from, to: to); + var content = result[0]; + var response = result[1]; + expect(content, expected); + expect( + response.headers[HttpHeaders.contentRangeHeader][0], contentRange); + expect(expected.length, response.headers.contentLength); + expect(response.statusCode, HttpStatus.partialContent); + } + + await check(0, 0); + await check(0, 1); + await check(1, 2); + await check(1, 9); + await check(0, 9); + await check(8, 9); + await check(9, 9); + await check(0, 10, fileContent, 'bytes 0-9/10'); + await check(9, 10, [9], 'bytes 9-9/10'); + await check(0, 1000, fileContent, 'bytes 0-9/10'); + }); + + testVirtualDir('prefix-range', (dir) async { + prepare(dir); + Future check(int from, + [List? expected, + String? contentRange, + bool expectContentRange = true, + int expectedStatusCode = HttpStatus.partialContent]) async { + expected ??= fileContent.sublist(from, fileContent.length); + if (contentRange == null && expectContentRange) { + contentRange = 'bytes $from-' + '${fileContent.length - 1}/' + '${fileContent.length}'; + } + var result = await fetchContentAndResponse(virDir, '/file', from: from); + var content = result[0]; + var response = result[1]; + expect(content, expected); + if (expectContentRange) { + expect(response.headers[HttpHeaders.contentRangeHeader][0], + contentRange); + } else { + expect(response.headers[HttpHeaders.contentRangeHeader], null); + } + expect(response.statusCode, expectedStatusCode); + } + + await check(0); + await check(1); + await check(9); + await check(10, fileContent, null, false, HttpStatus.ok); + await check(11, fileContent, null, false, HttpStatus.ok); + await check(1000, fileContent, null, false, HttpStatus.ok); + }); + + testVirtualDir('suffix-range', (dir) async { + prepare(dir); + Future check(int to, + [List? expected, String? contentRange]) async { + expected ??= + fileContent.sublist(fileContent.length - to, fileContent.length); + contentRange ??= 'bytes ${fileContent.length - to}-' + '${fileContent.length - 1}/' + '${fileContent.length}'; + var result = await fetchContentAndResponse(virDir, '/file', to: to); + var content = result[0]; + var response = result[1]; + expect(content, expected); + expect( + response.headers[HttpHeaders.contentRangeHeader][0], contentRange); + expect(response.statusCode, HttpStatus.partialContent); + } + + await check(1); + await check(2); + await check(9); + await check(10); + await check(11, fileContent, 'bytes 0-9/10'); + await check(1000, fileContent, 'bytes 0-9/10'); + }); + + testVirtualDir('unsatisfiable-range', (dir) async { + prepare(dir); + Future check(int from, int to) async { + var result = + await fetchContentAndResponse(virDir, '/file', from: from, to: to); + var content = result[0]; + var response = result[1]; + expect(content.length, 0); + expect(response.headers[HttpHeaders.contentRangeHeader], isNull); + expect(response.statusCode, HttpStatus.requestedRangeNotSatisfiable); + } + + await check(10, 11); + await check(10, 1000); + await check(1000, 1000); + }); + + testVirtualDir('invalid-range', (dir) async { + prepare(dir); + Future check(int? 
from, int to) async { + var result = + await fetchContentAndResponse(virDir, '/file', from: from, to: to); + var content = result[0]; + var response = result[1]; + expect(content, fileContent); + expect(response.headers[HttpHeaders.contentRangeHeader], isNull); + expect(response.statusCode, HttpStatus.ok); + } + + await check(1, 0); + await check(10, 0); + await check(1000, 999); + await check(null, 0); // This is effectively range 10-9. + }); + }); + + group('error-page', () { + testVirtualDir('default', (dir) async { + var virDir = VirtualDirectory(pathos.join(dir.path, 'foo')); + + var result = await fetchAsString(virDir, '/'); + expect(result, matches(RegExp('404.*Not Found'))); + }); + + testVirtualDir('custom', (dir) async { + var virDir = VirtualDirectory(pathos.join(dir.path, 'foo')); + + virDir.errorPageHandler = (request) { + request.response.write('my-page '); + request.response.write(request.response.statusCode); + request.response.close(); + }; + + var result = await fetchAsString(virDir, '/'); + expect(result, 'my-page 404'); + }); + }); + + group('escape-root', () { + testVirtualDir('escape1', (dir) async { + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await statusCodeForVirtDir(virDir, '/../'); + expect(result, HttpStatus.notFound); + }); + + testVirtualDir('escape2', (dir) async { + Directory('${dir.path}/dir').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await statusCodeForVirtDir(virDir, '/dir/../../'); + expect(result, HttpStatus.notFound); + }); + }, + skip: 'Broken. Likely due to dart:core Uri changes.' + 'See https://github.com/dart-lang/http_server/issues/40'); + + group('url-decode', () { + testVirtualDir('with-space', (dir) async { + File('${dir.path}/my file').createSync(); + var virDir = VirtualDirectory(dir.path); + + var result = await statusCodeForVirtDir(virDir, '/my file'); + expect(result, HttpStatus.ok); + }); + + testVirtualDir('encoded-space', (dir) async { + File('${dir.path}/my file').createSync(); + var virDir = VirtualDirectory(dir.path); + + var result = await statusCodeForVirtDir(virDir, '/my%20file'); + expect(result, HttpStatus.notFound); + }); + + testVirtualDir('encoded-path-separator', (dir) async { + Directory('${dir.path}/a').createSync(); + Directory('${dir.path}/a/b').createSync(); + Directory('${dir.path}/a/b/c').createSync(); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = + await statusCodeForVirtDir(virDir, '/a%2fb/c', rawPath: true); + expect(result, HttpStatus.notFound); + }); + + testVirtualDir('encoded-null', (dir) async { + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + + var result = await statusCodeForVirtDir(virDir, '/%00', rawPath: true); + expect(result, HttpStatus.notFound); + }); + + group('broken', () { + _testEncoding('..', HttpStatus.notFound, false); + }, + skip: 'Broken. Likely due to dart:core Uri changes.' 
+ 'See https://github.com/dart-lang/http_server/issues/40'); + + _testEncoding('%2e%2e', HttpStatus.ok); + _testEncoding('%252e%252e', HttpStatus.ok); + _testEncoding('/', HttpStatus.ok, false); + _testEncoding('%2f', HttpStatus.notFound, false); + _testEncoding('%2f', HttpStatus.ok, true); + }); + + group('serve-file', () { + testVirtualDir('from-dir-handler', (dir) async { + File('${dir.path}/file').writeAsStringSync('file contents'); + var virDir = VirtualDirectory(dir.path); + virDir.allowDirectoryListing = true; + virDir.directoryHandler = (d, request) { + expect(FileSystemEntity.identicalSync(dir.path, d.path), isTrue); + return virDir.serveFile(File('${d.path}/file'), request); + }; + + var result = await fetchAsString(virDir, '/'); + expect(result, 'file contents'); + var headers = await fetchHEaders(virDir, '/'); + expect('file contents'.length, headers.contentLength); + }); + }); +} diff --git a/common/http_server/test/virtual_host_test.dart b/common/http_server/test/virtual_host_test.dart new file mode 100644 index 0000000..45b9690 --- /dev/null +++ b/common/http_server/test/virtual_host_test.dart @@ -0,0 +1,157 @@ +// Copyright (c) 2013, the Dart project authors. Please see the AUTHORS file +// for details. All rights reserved. Use of this source code is governed by a +// BSD-style license that can be found in the LICENSE file. + +import 'dart:async'; +import 'dart:io'; + +import 'package:test/test.dart'; +import 'package:platform_http_server/http_server.dart'; + +import 'utils.dart'; + +void main() { + setUpAll(setupSecure); + + group('virtual host', () { + late HttpServer server; + late VirtualHost virHost; + + setUp(() async { + server = await HttpServer.bind('localhost', 0); + virHost = VirtualHost(server); + }); + + tearDown(() async { + await server.close(); + }); + test('empty-host', () async { + var statusCode = await fetchStatusCode(server.port, '/'); + expect(statusCode, equals(HttpStatus.forbidden)); + }); + + test('empty-host-unhandled', () async { + var statusCodes = fetchStatusCode(server.port, '/'); + var request = await virHost.unhandled.first; + await request.response.close(); + expect(await statusCodes, equals(HttpStatus.ok)); + }); + + test('single-host', () async { + var host = virHost.addHost('*.host.com'); + var statusCode = fetchStatusCode(server.port, '/', host: 'my.host.com'); + var request = await host.first; + await request.response.close(); + expect(await statusCode, equals(HttpStatus.ok)); + }); + + test('multiple-host', () async {}); + + group('domain', () { + test('specific-sub-domain', () async { + var hosts = [ + virHost.addHost('my1.host.com'), + virHost.addHost('my2.host.com'), + virHost.addHost('my3.host.com'), + ]; + var statusCodes = [ + fetchStatusCode(server.port, '/', host: 'my1.host.com'), + fetchStatusCode(server.port, '/', host: 'my2.host.com'), + fetchStatusCode(server.port, '/', host: 'my3.host.com'), + ]; + for (var host in hosts) { + var request = await host.first; + await request.response.close(); + } + expect(await Future.wait(statusCodes), + equals([HttpStatus.ok, HttpStatus.ok, HttpStatus.ok])); + }); + + test('wildcard-sub-domain', () async { + var hosts = [ + virHost.addHost('*.host1.com'), + virHost.addHost('*.host2.com'), + virHost.addHost('*.host3.com'), + ]; + var statusCodes = [ + fetchStatusCode(server.port, '/', host: 'my.host1.com'), + fetchStatusCode(server.port, '/', host: 'my.host2.com'), + fetchStatusCode(server.port, '/', host: 'my.host3.com'), + ]; + for (var host in hosts) { + var request = await host.first; + 
await request.response.close(); + } + expect(await Future.wait(statusCodes), + equals([HttpStatus.ok, HttpStatus.ok, HttpStatus.ok])); + }); + + test('mix-sub-domain', () async { + var hosts = [ + virHost.addHost('my1.host.com'), + virHost.addHost('my2.host.com'), + virHost.addHost('*.host.com'), + ]; + var statusCodes = [ + fetchStatusCode(server.port, '/', host: 'my1.host.com'), + fetchStatusCode(server.port, '/', host: 'my2.host.com'), + fetchStatusCode(server.port, '/', host: 'my3.host.com'), + ]; + for (var host in hosts) { + var request = await host.first; + await request.response.close(); + } + expect(await Future.wait(statusCodes), + equals([HttpStatus.ok, HttpStatus.ok, HttpStatus.ok])); + }); + + test('wildcard', () async { + var hosts = [ + virHost.addHost('*'), + virHost.addHost('*.com'), + virHost.addHost('*.host.com'), + ]; + var statusCodes = [ + fetchStatusCode(server.port, '/', host: 'some.host.dk'), + fetchStatusCode(server.port, '/', host: 'my.host2.com'), + fetchStatusCode(server.port, '/', host: 'long.sub.of.host.com'), + ]; + for (var host in hosts) { + var request = await host.first; + await request.response.close(); + } + expect(await Future.wait(statusCodes), + equals([HttpStatus.ok, HttpStatus.ok, HttpStatus.ok])); + }); + }); + + test('multiple-source-https', () async { + var secondServer = + await HttpServer.bindSecure('localhost', 0, serverContext); + virHost.addSource(secondServer); + virHost.unhandled.listen((request) { + request.response.close(); + }); + var statusCodes = await Future.wait([ + fetchStatusCode(server.port, '/', host: 'myhost1.com'), + fetchStatusCode(secondServer.port, '/', + host: 'myhost2.com', secure: true) + ]); + expect(statusCodes, [HttpStatus.ok, HttpStatus.ok]); + await secondServer.close(); + }); + + test('duplicate-domain', () { + var virHost = VirtualHost(); + virHost.addHost('my1.host.com'); + expect(() => (virHost.addHost('my1.host.com')), throwsArgumentError); + virHost.addHost('*.host.com'); + expect(() => (virHost.addHost('*.host.com')), throwsArgumentError); + virHost.addHost('my2.host.com'); + virHost.addHost('my3.host.com'); + virHost.addHost('*.com'); + virHost.addHost('*'); + expect(() => (virHost.addHost('*')), throwsArgumentError); + }); + }); +} diff --git a/common/json_serializer/AUTHORS.md b/common/json_serializer/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/json_serializer/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. 
diff --git a/common/json_serializer/CHANGELOG.md b/common/json_serializer/CHANGELOG.md new file mode 100644 index 0000000..2603fcd --- /dev/null +++ b/common/json_serializer/CHANGELOG.md @@ -0,0 +1,66 @@ +# Change Log + +## 7.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 7.1.0 + +* Updated `lints` to 3.0.0 +* Fixed lints warnings + +## 7.0.0 + +* Require Dart >= 3.0 + +## 7.0.0-beta.1 + +* Require Dart >= 3.0 + +## 6.0.1 + +* Updated README + +## 6.0.0 + +* Require Dart >= 2.17 + +## 5.0.0 + +* Added `lints` linter +* Removed deprecated parameters +* Published as `platform_json_serializer` package + +## 4.0.3 + +* Fixed static analysis errors + +## 4.0.2 + +* Updated pubspec description +* Fixed static analysis errors + +## 4.0.1 + +* Added example +* Updated README + +## 4.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 3.0.0 + +* Migrated to work with Dart SDK 2.12.x Non NNBD + +## 2.0.0-beta+3 + +* Long-needed updates, ensured Dart 2 compatibility, fixed DDC breakages. +* Patches for reflection bugs with typing. + +## 2.0.0-beta+2 + +* This version breaks in certain Dart versions (likely anything *after* `2.0.0-dev.59.0`) +until is resolved. +* Removes the reference to `Schema` class. diff --git a/common/json_serializer/LICENSE b/common/json_serializer/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/json_serializer/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
\ No newline at end of file
diff --git a/common/json_serializer/README.md b/common/json_serializer/README.md
new file mode 100644
index 0000000..caff224
--- /dev/null
+++ b/common/json_serializer/README.md
@@ -0,0 +1,109 @@
+# Belatuk JSON Serializer
+
+![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_json_serializer?include_prereleases)
+[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety)
+[![Gitter](https://img.shields.io/gitter/room/angel_dart/discussion)](https://gitter.im/angel_dart/discussion)
+[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/json_serializer/LICENSE)
+
+**Replacement of `package:json_god` with breaking changes to support NNBD.**
+
+The ***new and improved*** definitive solution for JSON in Dart. It can synchronously transform an object into a JSON string and deserialize a JSON string back into an instance of any type.
+
+## Installation
+
+```yaml
+  dependencies:
+    platform_json_serializer: ^7.1.0
+```
+
+## Usage
+
+It is recommended to import the library under an alias, i.e., `jsonSerializer`.
+
+```dart
+import 'package:platform_json_serializer/json_serializer.dart' as jsonSerializer;
+```
+
+## Serializing JSON
+
+Simply call `jsonSerializer.serialize(x)` to synchronously transform an object into a JSON
+string.
+
+```dart
+Map map = {"foo": "bar", "numbers": [1, 2, {"three": 4}]};
+
+// Output: {"foo":"bar","numbers":[1,2,{"three":4}]}
+String json = jsonSerializer.serialize(map);
+print(json);
+```
+
+You can easily serialize classes, too. Belatuk JSON Serializer also supports classes as members.
+
+```dart
+
+class A {
+  String foo;
+  A(this.foo);
+}
+
+class B {
+  late String hello;
+  late A nested;
+  B(String hello, String foo) {
+    this.hello = hello;
+    this.nested = A(foo);
+  }
+}
+
+main() {
+  print(jsonSerializer.serialize(B("world", "bar")));
+}
+
+// Output: {"hello":"world","nested":{"foo":"bar"}}
+```
+
+If a class has a `toJson` method, it will be called instead.
+
+## Deserializing JSON
+
+Deserialization is equally easy, and is provided through `jsonSerializer.deserialize`.
+
+```dart
+Map map = jsonSerializer.deserialize('{"hello":"world"}');
+int three = jsonSerializer.deserialize("3");
+```
+
+### Deserializing to Classes
+
+Belatuk JSON Serializer lets you deserialize JSON into an instance of any type. Simply pass the target type via the `outputType` parameter of `jsonSerializer.deserialize`.
+
+If the class has a `fromJson` constructor, it will be called instead.
+
+```dart
+class Child {
+  late String foo;
+}
+
+class Parent {
+  late String hello;
+  Child child = Child();
+}
+
+main() {
+  Parent parent = jsonSerializer.deserialize('{"hello":"world","child":{"foo":"bar"}}', outputType: Parent);
+  print(parent);
+}
+```
+
+**Any JSON-deserializable class must be initializable without parameters. If `Foo()` would throw an error, then you can't use `Foo` with JSON.**
+
+This allows for validation of a sort, as only fields you have declared will be accepted.
+
+```dart
+class HasAnInt { late int theInt; }
+
+HasAnInt invalid = jsonSerializer.deserialize('["some invalid input"]', outputType: HasAnInt);
+// Throws an error
+```
+
+An exception will be thrown if validation fails.
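The `toJson`/`fromJson` hooks mentioned in the README are easy to miss, so here is a minimal sketch of how they interact with `serialize` and `deserialize`. It is not part of the patch: it assumes the top-level API shown in `lib/src/serialize.dart` and `lib/src/deserialize.dart` (`serialize(value)` and `deserialize(json, outputType: ...)`), and the `Point` class is purely illustrative.

```dart
import 'package:platform_json_serializer/json_serializer.dart' as jsonSerializer;

// Hypothetical class used only to illustrate the hooks.
class Point {
  double x, y;
  Point(this.x, this.y);

  // serialize() discovers this method via mirrors and uses its result
  // instead of reflecting over the getters one by one.
  Map<String, dynamic> toJson() => {'x': x, 'y': y};

  // deserialize(..., outputType: Point) discovers this named constructor
  // and uses it instead of creating an instance and setting each field.
  Point.fromJson(Map data)
      : x = data['x'] as double,
        y = data['y'] as double;
}

void main() {
  var json = jsonSerializer.serialize(Point(1.0, 2.0));
  print(json); // {"x":1.0,"y":2.0}

  var p = jsonSerializer.deserialize(json, outputType: Point) as Point;
  print('${p.x}, ${p.y}'); // 1.0, 2.0
}
```

Both hooks are discovered with `dart:mirrors`, so, like the rest of the package, they only work on the Dart VM, not in AOT or web builds.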
diff --git a/common/json_serializer/analysis_options.yaml b/common/json_serializer/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/json_serializer/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/json_serializer/example/main.dart b/common/json_serializer/example/main.dart new file mode 100644 index 0000000..faeccd7 --- /dev/null +++ b/common/json_serializer/example/main.dart @@ -0,0 +1,18 @@ +import 'package:platform_json_serializer/json_serializer.dart' as god; + +class A { + String foo; + A(this.foo); +} + +class B { + String hello; + late A nested; + B(this.hello, String foo) { + nested = A(foo); + } +} + +void main() { + print(god.serialize(B("world", "bar"))); +} diff --git a/common/json_serializer/lib/json_serializer.dart b/common/json_serializer/lib/json_serializer.dart new file mode 100644 index 0000000..5f15641 --- /dev/null +++ b/common/json_serializer/lib/json_serializer.dart @@ -0,0 +1,18 @@ +/// A robust library for JSON serialization and deserialization. +library json_serializer; + +//import 'package:dart2_constant/convert.dart'; +import 'dart:convert'; +import 'package:logging/logging.dart'; +import 'src/reflection.dart' as reflection; + +part 'src/serialize.dart'; +part 'src/deserialize.dart'; +part 'src/validation.dart'; +part 'src/util.dart'; + +/// Instead, listen to [logger]. +//@deprecated +//bool debug = false; + +final Logger logger = Logger('platform_json_serializer'); diff --git a/common/json_serializer/lib/src/deserialize.dart b/common/json_serializer/lib/src/deserialize.dart new file mode 100644 index 0000000..cf5626d --- /dev/null +++ b/common/json_serializer/lib/src/deserialize.dart @@ -0,0 +1,44 @@ +part of '../json_serializer.dart'; + +/// Deserializes a JSON string into a Dart datum. +/// +/// You can also provide an output Type to attempt to serialize the JSON into. +deserialize(String json, {Type? outputType}) { + var deserialized = deserializeJson(json, outputType: outputType); + logger.info("Deserialization result: $deserialized"); + return deserialized; +} + +/// Deserializes JSON into data, without validating it. +deserializeJson(String s, {Type? outputType}) { + logger.info("Deserializing the following JSON: $s"); + + if (outputType == null) { + logger + .info("No output type was specified, so we are just using json.decode"); + return json.decode(s); + } else { + logger.info("Now deserializing to type: $outputType"); + return deserializeDatum(json.decode(s), outputType: outputType); + } +} + +/// Deserializes some JSON-serializable value into a usable Dart value. +deserializeDatum(value, {Type? 
outputType}) { + if (outputType != null) { + return reflection.deserialize(value, outputType, deserializeDatum); + } else if (value is List) { + logger.info("Deserializing this List: $value"); + return value.map(deserializeDatum).toList(); + } else if (value is Map) { + logger.info("Deserializing this Map: $value"); + Map result = {}; + value.forEach((k, v) { + result[k] = deserializeDatum(v); + }); + return result; + } else if (_isPrimitive(value)) { + logger.info("Value $value is a primitive"); + return value; + } +} diff --git a/common/json_serializer/lib/src/reflection.dart b/common/json_serializer/lib/src/reflection.dart new file mode 100644 index 0000000..d3659a7 --- /dev/null +++ b/common/json_serializer/lib/src/reflection.dart @@ -0,0 +1,188 @@ +library platform_json_serializer.reflection; + +import 'dart:mirrors'; +import '../json_serializer.dart'; + +const Symbol hashCodeSymbol = #hashCode; +const Symbol runtimeTypeSymbol = #runtimeType; + +typedef Serializer = dynamic Function(dynamic value); +typedef Deserializer = dynamic Function(dynamic value, {Type? outputType}); + +List _findGetters(ClassMirror classMirror) { + List result = []; + + classMirror.instanceMembers + .forEach((Symbol symbol, MethodMirror methodMirror) { + if (methodMirror.isGetter && + symbol != hashCodeSymbol && + symbol != runtimeTypeSymbol) { + logger.info("Found getter on instance: $symbol"); + result.add(symbol); + } + }); + + return result; +} + +serialize(value, Serializer serializer) { + logger.info("Serializing this value via reflection: $value"); + Map result = {}; + InstanceMirror instanceMirror = reflect(value); + ClassMirror classMirror = instanceMirror.type; + + // Check for toJson + for (Symbol symbol in classMirror.instanceMembers.keys) { + if (symbol == #toJson) { + logger.info("Running toJson..."); + var result = instanceMirror.invoke(symbol, []).reflectee; + logger.info("Result of serialization via reflection: $result"); + return result; + } + } + + for (Symbol symbol in _findGetters(classMirror)) { + String name = MirrorSystem.getName(symbol); + var valueForSymbol = instanceMirror.getField(symbol).reflectee; + + try { + result[name] = serializer(valueForSymbol); + logger.info("Set $name to $valueForSymbol"); + } catch (e, st) { + logger.severe("Could not set $name to $valueForSymbol", e, st); + } + } + + logger.info("Result of serialization via reflection: $result"); + + return result; +} + +deserialize(value, Type outputType, Deserializer deserializer) { + logger.info("About to deserialize $value to a $outputType"); + + try { + if (value is List) { + List typeArguments = reflectType(outputType).typeArguments; + + Iterable it; + + if (typeArguments.isEmpty) { + it = value.map(deserializer); + } else { + it = value.map((item) => + deserializer(item, outputType: typeArguments[0].reflectedType)); + } + + if (typeArguments.isEmpty) return it.toList(); + logger.info( + 'Casting list elements to ${typeArguments[0].reflectedType} via List.from'); + + var mirror = reflectType(List, [typeArguments[0].reflectedType]); + + if (mirror is ClassMirror) { + var output = mirror.newInstance(#from, [it]).reflectee; + logger.info('Casted list type: ${output.runtimeType}'); + return output; + } else { + throw ArgumentError( + '${typeArguments[0].reflectedType} is not a class.'); + } + } else if (value is Map) { + return _deserializeFromJsonByReflection(value, deserializer, outputType); + } else { + return deserializer(value); + } + } catch (e, st) { + logger.severe('Deserialization failed.', e, st); + rethrow; 
+ } +} + +/// Uses mirrors to deserialize an object. +_deserializeFromJsonByReflection( + data, Deserializer deserializer, Type outputType) { + // Check for fromJson + var typeMirror = reflectType(outputType); + + if (typeMirror is! ClassMirror) { + throw ArgumentError('$outputType is not a class.'); + } + + var type = typeMirror; + var fromJson = Symbol('${MirrorSystem.getName(type.simpleName)}.fromJson'); + + for (Symbol symbol in type.declarations.keys) { + if (symbol == fromJson) { + var decl = type.declarations[symbol]; + + if (decl is MethodMirror && decl.isConstructor) { + logger.info("Running fromJson..."); + var result = type.newInstance(#fromJson, [data]).reflectee; + + logger.info("Result of deserialization via reflection: $result"); + return result; + } + } + } + + ClassMirror classMirror = type; + InstanceMirror instanceMirror = classMirror.newInstance(Symbol(""), []); + + if (classMirror.isSubclassOf(reflectClass(Map))) { + var typeArguments = classMirror.typeArguments; + + if (typeArguments.isEmpty || + classMirror.typeArguments + .every((t) => t == currentMirrorSystem().dynamicType)) { + return data; + } else { + var mapType = + reflectType(Map, typeArguments.map((t) => t.reflectedType).toList()) + as ClassMirror; + logger.info('Casting this map $data to Map of [$typeArguments]'); + var output = mapType.newInstance(Symbol(''), []).reflectee; + + for (var key in data.keys) { + output[key] = data[key]; + } + + logger.info('Output: $output of type ${output.runtimeType}'); + return output; + } + } else { + data.keys.forEach((key) { + try { + logger.info("Now deserializing value for $key"); + logger.info("data[\"$key\"] = ${data[key]}"); + var deserializedValue = deserializer(data[key]); + + logger.info( + "I want to set $key to the following ${deserializedValue.runtimeType}: $deserializedValue"); + // Get target type of getter + Symbol searchSymbol = Symbol(key.toString()); + Symbol symbolForGetter = classMirror.instanceMembers.keys + .firstWhere((x) => x == searchSymbol); + Type requiredType = classMirror + .instanceMembers[symbolForGetter]!.returnType.reflectedType; + if (data[key].runtimeType != requiredType) { + logger.info("Currently, $key is a ${data[key].runtimeType}."); + logger.info("However, $key must be a $requiredType."); + + deserializedValue = + deserializer(deserializedValue, outputType: requiredType); + } + + logger.info( + "Final deserialized value for $key: $deserializedValue <${deserializedValue.runtimeType}>"); + instanceMirror.setField(Symbol(key.toString()), deserializedValue); + + logger.info("Success! $key has been set to $deserializedValue"); + } catch (e, st) { + logger.severe('Could not set value for field $key.', e, st); + } + }); + } + + return instanceMirror.reflectee; +} diff --git a/common/json_serializer/lib/src/serialize.dart b/common/json_serializer/lib/src/serialize.dart new file mode 100644 index 0000000..a7fe825 --- /dev/null +++ b/common/json_serializer/lib/src/serialize.dart @@ -0,0 +1,36 @@ +part of '../json_serializer.dart'; + +/// Serializes any arbitrary Dart datum to JSON. Supports schema validation. +String serialize(value) { + var serialized = serializeObject(value); + logger.info('Serialization result: $serialized'); + return json.encode(serialized); +} + +/// Transforms any Dart datum into a value acceptable to json.encode. 
+serializeObject(value) { + if (_isPrimitive(value)) { + logger.info("Serializing primitive value: $value"); + return value; + } else if (value is DateTime) { + logger.info("Serializing this DateTime: $value"); + return value.toIso8601String(); + } else if (value is Iterable) { + logger.info("Serializing this Iterable: $value"); + return value.map(serializeObject).toList(); + } else if (value is Map) { + logger.info("Serializing this Map: $value"); + return serializeMap(value); + } else { + return serializeObject(reflection.serialize(value, serializeObject)); + } +} + +/// Recursively transforms a Map and its children into JSON-serializable data. +Map serializeMap(Map value) { + Map outputMap = {}; + value.forEach((key, value) { + outputMap[key] = serializeObject(value); + }); + return outputMap; +} diff --git a/common/json_serializer/lib/src/util.dart b/common/json_serializer/lib/src/util.dart new file mode 100644 index 0000000..54dedb3 --- /dev/null +++ b/common/json_serializer/lib/src/util.dart @@ -0,0 +1,5 @@ +part of '../json_serializer.dart'; + +bool _isPrimitive(value) { + return value is num || value is bool || value is String || value == null; +} diff --git a/common/json_serializer/lib/src/validation.dart b/common/json_serializer/lib/src/validation.dart new file mode 100644 index 0000000..8bc5c02 --- /dev/null +++ b/common/json_serializer/lib/src/validation.dart @@ -0,0 +1,25 @@ +part of '../json_serializer.dart'; + +/// Thrown when schema validation fails. +class JsonValidationError implements Exception { + //final Schema schema; + final dynamic invalidData; + final String cause; + + const JsonValidationError( + this.cause, this.invalidData); //, Schema this.schema); +} + +/// Specifies a schema to validate a class with. +class WithSchema { + final Map schema; + + const WithSchema(this.schema); +} + +/// Specifies a schema to validate a class with. 
+class WithSchemaUrl { + final String schemaUrl; + + const WithSchemaUrl(this.schemaUrl); +} diff --git a/common/json_serializer/pubspec.lock b/common/json_serializer/pubspec.lock new file mode 100644 index 0000000..a5bed97 --- /dev/null +++ b/common/json_serializer/pubspec.lock @@ -0,0 +1,402 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + 
url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: "direct main" + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: 
c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: "direct dev" + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: 
transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/json_serializer/pubspec.yaml b/common/json_serializer/pubspec.yaml new file mode 100644 index 0000000..4b903df --- /dev/null +++ b/common/json_serializer/pubspec.yaml @@ -0,0 +1,12 @@ +name: platform_json_serializer +version: 7.2.0 +description: Easy JSON to Object serialization and deserialization in Dart. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/json_serializer +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + logging: ^1.0.1 +dev_dependencies: + stack_trace: ^1.10.0 + test: ^1.24.0 + lints: ^4.0.0 \ No newline at end of file diff --git a/common/json_serializer/test/deserialization_test.dart b/common/json_serializer/test/deserialization_test.dart new file mode 100644 index 0000000..1aabd6f --- /dev/null +++ b/common/json_serializer/test/deserialization_test.dart @@ -0,0 +1,115 @@ +import 'package:platform_json_serializer/json_serializer.dart' as god; +import 'package:test/test.dart'; +import 'shared.dart'; + +main() { + god.logger.onRecord.listen(printRecord); + + group('deserialization', () { + test('deserialize primitives', testDeserializationOfPrimitives); + + test('deserialize maps', testDeserializationOfMaps); + + test('deserialize maps + reflection', + testDeserializationOfMapsWithReflection); + + test('deserialize lists + reflection', + testDeserializationOfListsAsWellAsViaReflection); + + test('deserialize with schema validation', + testDeserializationWithSchemaValidation); + }); +} + +testDeserializationOfPrimitives() { + expect(god.deserialize('1'), equals(1)); + expect(god.deserialize('1.4'), equals(1.4)); + expect(god.deserialize('"Hi!"'), equals("Hi!")); + expect(god.deserialize("true"), equals(true)); + expect(god.deserialize("null"), equals(null)); +} + +testDeserializationOfMaps() { + String simpleJson = + '{"hello":"world", "one": 1, "class": {"hello": "world"}}'; + String nestedJson = + '{"foo": {"bar": "baz", "funny": {"how": "life", "seems": 2, "hate": "us sometimes"}}}'; + var simple = god.deserialize(simpleJson) as Map; + var nested = god.deserialize(nestedJson) as Map; + + expect(simple['hello'], equals('world')); + expect(simple['one'], equals(1)); + expect(simple['class']['hello'], equals('world')); + + expect(nested['foo']['bar'], equals('baz')); + expect(nested['foo']['funny']['how'], equals('life')); + expect(nested['foo']['funny']['seems'], equals(2)); + expect(nested['foo']['funny']['hate'], equals('us sometimes')); +} + +class Pokedex { + Map? 
pokemon; +} + +testDeserializationOfMapsWithReflection() { + var s = '{"pokemon": {"Bulbasaur": 1, "Deoxys": 382}}'; + var pokedex = god.deserialize(s, outputType: Pokedex) as Pokedex; + expect(pokedex.pokemon, hasLength(2)); + expect(pokedex.pokemon!['Bulbasaur'], 1); + expect(pokedex.pokemon!['Deoxys'], 382); +} + +testDeserializationOfListsAsWellAsViaReflection() { + String json = '''[ + { + "hello": "world", + "nested": [] + }, + { + "hello": "dolly", + "nested": [ + { + "bar": "baz" + }, + { + "bar": "fight" + } + ] + } + ] + '''; + + var list = god.deserialize(json, outputType: ([]).runtimeType) + as List; + SampleClass first = list[0]; + SampleClass second = list[1]; + + expect(list.length, equals(2)); + expect(first.hello, equals("world")); + expect(first.nested.length, equals(0)); + expect(second.hello, equals("dolly")); + expect(second.nested.length, equals(2)); + + SampleNestedClass firstNested = second.nested[0]; + SampleNestedClass secondNested = second.nested[1]; + + expect(firstNested.bar, equals("baz")); + expect(secondNested.bar, equals("fight")); +} + +testDeserializationWithSchemaValidation() async { + String babelRcJson = + '{"presets":["es2015","stage-0"],"plugins":["add-module-exports"]}'; + + var deserialized = + god.deserialize(babelRcJson, outputType: BabelRc) as BabelRc; + + print(deserialized.presets.runtimeType); + //expect(deserialized.presets is List, equals(true)); + expect(deserialized.presets.length, equals(2)); + expect(deserialized.presets[0], equals('es2015')); + expect(deserialized.presets[1], equals('stage-0')); + //expect(deserialized.plugins is List, equals(true)); + expect(deserialized.plugins.length, equals(1)); + expect(deserialized.plugins[0], equals('add-module-exports')); +} diff --git a/common/json_serializer/test/serialization_test.dart b/common/json_serializer/test/serialization_test.dart new file mode 100644 index 0000000..5f48e48 --- /dev/null +++ b/common/json_serializer/test/serialization_test.dart @@ -0,0 +1,133 @@ +//import 'package:dart2_constant/convert.dart'; +import 'dart:convert'; + +import 'package:platform_json_serializer/json_serializer.dart' as god; +import 'package:test/test.dart'; +import 'shared.dart'; + +main() { + god.logger.onRecord.listen(printRecord); + + group('serialization', () { + test('serialize primitives', testSerializationOfPrimitives); + + test('serialize dates', testSerializationOfDates); + + test('serialize maps', testSerializationOfMaps); + + test('serialize lists', testSerializationOfLists); + + test('serialize via reflection', testSerializationViaReflection); + + test('serialize with schema validation', + testSerializationWithSchemaValidation); + }); +} + +testSerializationOfPrimitives() { + expect(god.serialize(1), equals("1")); + expect(god.serialize(1.4), equals("1.4")); + expect(god.serialize("Hi!"), equals('"Hi!"')); + expect(god.serialize(true), equals("true")); + expect(god.serialize(null), equals("null")); +} + +testSerializationOfDates() { + DateTime date = DateTime.now(); + String s = god.serialize({'date': date}); + + print(s); + + var deserialized = json.decode(s); + expect(deserialized['date'], equals(date.toIso8601String())); +} + +testSerializationOfMaps() { + var simple = json.decode(god + .serialize({'hello': 'world', 'one': 1, 'class': SampleClass('world')})); + var nested = json.decode(god.serialize({ + 'foo': { + 'bar': 'baz', + 'funny': {'how': 'life', 'seems': 2, 'hate': 'us sometimes'} + } + })); + + expect(simple['hello'], equals('world')); + expect(simple['one'], equals(1)); + 
expect(simple['class']['hello'], equals('world')); + + expect(nested['foo']['bar'], equals('baz')); + expect(nested['foo']['funny']['how'], equals('life')); + expect(nested['foo']['funny']['seems'], equals(2)); + expect(nested['foo']['funny']['hate'], equals('us sometimes')); +} + +testSerializationOfLists() { + List pandorasBox = [ + 1, + "2", + {"num": 3, "four": SampleClass('five')}, + SampleClass('six')..nested.add(SampleNestedClass('seven')) + ]; + String s = god.serialize(pandorasBox); + print(s); + + var deserialized = json.decode(s); + + expect(deserialized is List, equals(true)); + expect(deserialized.length, equals(4)); + expect(deserialized[0], equals(1)); + expect(deserialized[1], equals("2")); + expect(deserialized[2] is Map, equals(true)); + expect(deserialized[2]['num'], equals(3)); + expect(deserialized[2]['four'] is Map, equals(true)); + expect(deserialized[2]['four']['hello'], equals('five')); + expect(deserialized[3] is Map, equals(true)); + expect(deserialized[3]['hello'], equals('six')); + expect(deserialized[3]['nested'] is List, equals(true)); + expect(deserialized[3]['nested'].length, equals(1)); + expect(deserialized[3]['nested'][0] is Map, equals(true)); + expect(deserialized[3]['nested'][0]['bar'], equals('seven')); +} + +testSerializationViaReflection() { + SampleClass sample = SampleClass('world'); + + for (int i = 0; i < 3; i++) { + sample.nested.add(SampleNestedClass('baz')); + } + + String s = god.serialize(sample); + print(s); + + var deserialized = json.decode(s); + expect(deserialized['hello'], equals('world')); + expect(deserialized['nested'] is List, equals(true)); + expect(deserialized['nested'].length == 3, equals(true)); + expect(deserialized['nested'][0]['bar'], equals('baz')); + expect(deserialized['nested'][1]['bar'], equals('baz')); + expect(deserialized['nested'][2]['bar'], equals('baz')); +} + +testSerializationWithSchemaValidation() async { + BabelRc babelRc = + BabelRc(presets: ['es2015', 'stage-0'], plugins: ['add-module-exports']); + + String s = god.serialize(babelRc); + print(s); + + var deserialized = json.decode(s); + + expect(deserialized['presets'] is List, equals(true)); + expect(deserialized['presets'].length, equals(2)); + expect(deserialized['presets'][0], equals('es2015')); + expect(deserialized['presets'][1], equals('stage-0')); + expect(deserialized['plugins'] is List, equals(true)); + expect(deserialized['plugins'].length, equals(1)); + expect(deserialized['plugins'][0], equals('add-module-exports')); + + //Map babelRc2 = {'presets': 'Hello, world!'}; + + String json2 = god.serialize(babelRc); + print(json2); +} diff --git a/common/json_serializer/test/shared.dart b/common/json_serializer/test/shared.dart new file mode 100644 index 0000000..20c096a --- /dev/null +++ b/common/json_serializer/test/shared.dart @@ -0,0 +1,49 @@ +import 'package:logging/logging.dart'; +import 'package:platform_json_serializer/json_serializer.dart'; +import 'package:stack_trace/stack_trace.dart'; + +void printRecord(LogRecord rec) { + print(rec); + if (rec.error != null) print(rec.error); + if (rec.stackTrace != null) print(Chain.forTrace(rec.stackTrace!).terse); +} + +class SampleNestedClass { + String? bar; + + SampleNestedClass([this.bar]); +} + +class SampleClass { + String? 
hello; + List nested = []; + + SampleClass([this.hello]); +} + +@WithSchemaUrl( + "http://raw.githubusercontent.com/SchemaStore/schemastore/master/src/schemas/json/babelrc.json") +class BabelRc { + List presets; + List plugins; + + BabelRc({this.presets = const [], this.plugins = const []}); +} + +@WithSchema({ + r"$schema": "http://json-schema.org/draft-04/schema#", + "title": "Validated Sample Class", + "description": "Sample schema for validation via JSON God", + "type": "object", + "hello": {"description": "A friendly greeting.", "type": "string"}, + "nested": { + "description": "A list of NestedSampleClass items within this instance.", + "type": "array", + "items": { + "type": "object", + "bar": {"description": "Filler text", "type": "string"} + } + }, + "required": ["hello", "nested"] +}) +class ValidatedSampleClass {} diff --git a/common/json_serializer/test/to_json_test.dart b/common/json_serializer/test/to_json_test.dart new file mode 100644 index 0000000..126f1f1 --- /dev/null +++ b/common/json_serializer/test/to_json_test.dart @@ -0,0 +1,32 @@ +import 'package:platform_json_serializer/json_serializer.dart' as god; +import 'package:test/test.dart'; +import 'shared.dart'; + +main() { + god.logger.onRecord.listen(printRecord); + + test('fromJson', () { + var foo = god.deserialize('{"bar":"baz"}', outputType: Foo) as Foo; + + //expect(foo is Foo, true); + expect(foo.text, equals('baz')); + }); + + test('toJson', () { + var foo = Foo(text: 'baz'); + var data = god.serializeObject(foo); + expect(data, equals({'bar': 'baz', 'foo': 'poobaz'})); + }); +} + +class Foo { + String? text; + + String get foo => 'poo$text'; + + Foo({this.text}); + + factory Foo.fromJson(Map json) => Foo(text: json['bar'].toString()); + + Map toJson() => {'bar': text, 'foo': foo}; +} diff --git a/common/merge_map/AUTHORS.md b/common/merge_map/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/merge_map/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/merge_map/CHANGELOG.md b/common/merge_map/CHANGELOG.md new file mode 100644 index 0000000..47830ad --- /dev/null +++ b/common/merge_map/CHANGELOG.md @@ -0,0 +1,58 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.0.2 + +* Fixed license link + +## 3.0.1 + +* Updated README + +## 3.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Published as `platform_merge_map` package +* Fixed linter warnings + +## 2.0.2 + +* Resolve static analysis warnings + +## 2.0.1 + +* Updated README + +## 2.0.0 + +* Migrated to work with Dart SDK 2.12.x NNBD + +## 1.0.2 + +* Add an example, for Pub's sake. + +## 1.0.1 + +* Add a specific constraint on Dart versions, to prevent Pub from rejecting all packages that depend on +`merge_map` (the entire Angel framework). 
+* Add generic type support diff --git a/common/merge_map/LICENSE b/common/merge_map/LICENSE new file mode 100644 index 0000000..df5e063 --- /dev/null +++ b/common/merge_map/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/common/merge_map/README.md b/common/merge_map/README.md new file mode 100644 index 0000000..2bc5961 --- /dev/null +++ b/common/merge_map/README.md @@ -0,0 +1,25 @@ +# Belatuk Merge Map + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_merge_map?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/merge_map/LICENSE) + +**Replacement of `package:merge_map` with breaking changes to support NNBD.** + +Combine multiple Maps into one. Equivalent to +[Object.assign](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/assign) in JS. 
+
+## Example
+
+```dart
+import 'package:platform_merge_map/merge_map.dart';
+
+void main() {
+  Map map1 = {'hello': 'world'};
+  Map map2 = {'foo': {'bar': 'baz', 'this': 'will be overwritten'}};
+  Map map3 = {'foo': {'john': 'doe', 'this': 'overrides previous maps'}};
+  var merged = mergeMap([map1, map2, map3]);
+
+  // {hello: world, foo: {bar: baz, john: doe, this: overrides previous maps}}
+}
+```
diff --git a/common/merge_map/analysis_options.yaml b/common/merge_map/analysis_options.yaml
new file mode 100644
index 0000000..ea2c9e9
--- /dev/null
+++ b/common/merge_map/analysis_options.yaml
@@ -0,0 +1 @@
+include: package:lints/recommended.yaml
\ No newline at end of file
diff --git a/common/merge_map/example/main.dart b/common/merge_map/example/main.dart
new file mode 100644
index 0000000..64521a8
--- /dev/null
+++ b/common/merge_map/example/main.dart
@@ -0,0 +1,20 @@
+import 'package:platform_merge_map/merge_map.dart';
+
+void main() {
+  // ignore: omit_local_variable_types
+  Map map1 = {'hello': 'world'};
+
+  // ignore: omit_local_variable_types
+  Map map2 = {
+    'foo': {'bar': 'baz', 'this': 'will be overwritten'}
+  };
+
+  // ignore: omit_local_variable_types
+  Map map3 = {
+    'foo': {'john': 'doe', 'this': 'overrides previous maps'}
+  };
+  var merged = mergeMap([map1, map2, map3]);
+  print(merged);
+
+  // {hello: world, foo: {bar: baz, john: doe, this: overrides previous maps}}
+}
diff --git a/common/merge_map/lib/merge_map.dart b/common/merge_map/lib/merge_map.dart
new file mode 100644
index 0000000..a709908
--- /dev/null
+++ b/common/merge_map/lib/merge_map.dart
@@ -0,0 +1,36 @@
+/// Exposes the [mergeMap] function, which... merges Maps.
+library angel3_merge_map;
+
+// Copies entries from [from] into [to], recursing into nested maps when
+// [recursive] is true and skipping null values unless [acceptNull] is true.
+dynamic _copyValues<K, V>(
+    Map<K, V> from, Map<K, V> to, bool recursive, bool acceptNull) {
+  for (var key in from.keys) {
+    if (from[key] is Map<K, V> && recursive) {
+      if (to[key] is! Map<K, V>) {
+        to[key] = <K, V>{} as V;
+      }
+      _copyValues(from[key] as Map<K, V>, to[key] as Map<K, V>, recursive,
+          acceptNull);
+    } else {
+      if (from[key] != null || acceptNull) {
+        to[key] = from[key] as V;
+      }
+    }
+  }
+}
+
+/// Merges the values of the given maps together.
+///
+/// `recursive` is set to `true` by default. If set to `true`,
+/// then nested maps will also be merged. Otherwise, nested maps
+/// will overwrite others.
+///
+/// `acceptNull` is set to `false` by default. If set to `false`,
+/// then if the value on a map is `null`, it will be ignored, and
+/// that `null` will not be copied.
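+///
+/// A brief illustrative sketch (assuming the default `recursive: true`;
+/// the map literals are hypothetical):
+///
+/// ```dart
+/// var merged = mergeMap([
+///   {'settings': {'debug': true}},
+///   {'settings': {'port': 8080}},
+/// ]);
+/// // merged == {'settings': {'debug': true, 'port': 8080}}
+///
+/// var overwritten = mergeMap([
+///   {'settings': {'debug': true}},
+///   {'settings': {'port': 8080}},
+/// ], recursive: false);
+/// // overwritten == {'settings': {'port': 8080}} (nested map replaced whole)
+/// ```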
+Map mergeMap(Iterable> maps, + {bool recursive = true, bool acceptNull = false}) { + var result = {}; + for (var map in maps) { + _copyValues(map, result, recursive, acceptNull); + } + return result; +} diff --git a/common/merge_map/pubspec.lock b/common/merge_map/pubspec.lock new file mode 100644 index 0000000..19e27ee --- /dev/null +++ b/common/merge_map/pubspec.lock @@ -0,0 +1,402 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + 
description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + 
name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + 
version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/merge_map/pubspec.yaml b/common/merge_map/pubspec.yaml new file mode 100644 index 0000000..4ef6d7a --- /dev/null +++ b/common/merge_map/pubspec.yaml @@ -0,0 +1,9 @@ +name: platform_merge_map +version: 5.2.0 +description: Combine multiple Maps into one. Equivalent to Object.assign in JS. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/merge_map +environment: + sdk: '>=3.3.0 <4.0.0' +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/merge_map/test/all_test.dart b/common/merge_map/test/all_test.dart new file mode 100644 index 0000000..d32ad44 --- /dev/null +++ b/common/merge_map/test/all_test.dart @@ -0,0 +1,104 @@ +import 'package:platform_merge_map/merge_map.dart'; +import 'package:test/test.dart'; + +void main() { + test('can merge two simple maps', () { + var merged = mergeMap([ + {'hello': 'world'}, + {'hello': 'dolly'} + ]); + expect(merged['hello'], equals('dolly')); + }); + + test("the last map's values supersede those of prior", () { + var merged = mergeMap([ + {'letter': 'a'}, + {'letter': 'b'}, + {'letter': 'c'} + ]); + expect(merged['letter'], equals('c')); + }); + + test('can merge two once-nested maps', () { + // ignore: omit_local_variable_types + Map map1 = { + 'hello': 'world', + 'foo': {'nested': false} + }; + // ignore: omit_local_variable_types + Map map2 = { + 'goodbye': 'sad life', + 'foo': {'nested': true, 'it': 'works'} + }; + var merged = mergeMap([map1, map2]); + + expect(merged['hello'], equals('world')); + expect(merged['goodbye'], equals('sad life')); + expect(merged['foo']['nested'], equals(true)); + expect(merged['foo']['it'], equals('works')); + }); + + test('once-nested map supersession', () { + // ignore: omit_local_variable_types + Map map1 = { + 'hello': 'world', + 'foo': {'nested': false} + }; + // ignore: omit_local_variable_types + Map map2 = { + 'goodbye': 'sad life', + 'foo': {'nested': true, 'it': 'works'} + }; + // ignore: omit_local_variable_types + Map map3 = { + 'foo': {'nested': 'supersession'} + }; + + var merged = mergeMap([map1, map2, map3]); + expect(merged['foo']['nested'], equals('supersession')); + }); + + test('can merge two twice-nested maps', () { + // ignore: omit_local_variable_types + Map map1 = { + 'a': { + 'b': {'c': 'd'} + } + }; + // ignore: omit_local_variable_types + Map map2 = { + 'a': { + 'b': {'c': 'D', 'e': 'f'} + } + }; + var merged = mergeMap([map1, map2]); + + expect(merged['a']['b']['c'], equals('D')); + expect(merged['a']['b']['e'], equals('f')); + }); + + test('twice-nested map supersession', () { + // ignore: omit_local_variable_types + Map map1 = { + 'a': { + 'b': {'c': 'd'} + } + }; + // ignore: omit_local_variable_types + Map map2 = { + 'a': { + 'b': {'c': 'D', 'e': 'f'} + } + }; + // ignore: omit_local_variable_types + Map map3 = { + 'a': { + 'b': {'e': 'supersession'} + } + }; + var merged = mergeMap([map1, map2, map3]); + + expect(merged['a']['b']['c'], equals('D')); + expect(merged['a']['b']['e'], equals('supersession')); + }); +} diff --git a/common/pretty_logging/AUTHORS.md b/common/pretty_logging/AUTHORS.md new file mode 100644 index 0000000..8563156 --- /dev/null +++ b/common/pretty_logging/AUTHORS.md @@ -0,0 +1,11 @@ +Primary Authors +=============== + +* __[Thomas 
Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/pretty_logging/CHANGELOG.md b/common/pretty_logging/CHANGELOG.md new file mode 100644 index 0000000..8ea9d05 --- /dev/null +++ b/common/pretty_logging/CHANGELOG.md @@ -0,0 +1,52 @@ +# Change Log + +## 6.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 6.1.0 + +* Updated `lints` to 3.0.0 + +## 6.0.0 + +* Require Dart >= 3.0 + +## 6.0.0-beta.1 + +* Require Dart >= 3.0 + +## 5.0.0 + +* Require Dart >= 2.17 + +## 4.0.0 + +* Added `lints` linter +* Removed deprecated parameters +* Published as `platform_pretty_logging` package + +## 3.0.3 + +* Updated README + +## 3.0.2 + +* Updated README + +## 3.0.1 + +* Fixed invalid homepage url in pubspec.yaml + +## 3.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 2.0.0 + +* Migrated to work with Dart SDK 2.12.x Non NNBD + +## 1.0.0 + +* Initial release. diff --git a/common/pretty_logging/LICENSE b/common/pretty_logging/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/pretty_logging/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
\ No newline at end of file
diff --git a/common/pretty_logging/README.md b/common/pretty_logging/README.md
new file mode 100644
index 0000000..3e4e6ba
--- /dev/null
+++ b/common/pretty_logging/README.md
@@ -0,0 +1,41 @@
+# Belatuk Pretty Logging
+
+![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_pretty_logging?include_prereleases)
+[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety)
+[![Gitter](https://img.shields.io/gitter/room/angel_dart/discussion)](https://gitter.im/angel_dart/discussion)
+[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/pretty_logging/LICENSE)
+
+**Replacement of `package:pretty_logging` with breaking changes to support NNBD.**
+
+Standalone helper for colorful logging output, using pkg:io AnsiCode.
+
+## Installation
+
+In your `pubspec.yaml`:
+
+```yaml
+dependencies:
+  platform_pretty_logging: ^6.1.0
+```
+
+## Usage
+
+Basic usage is very simple:
+
+```dart
+myLogger.onRecord.listen(prettyLog);
+```
+
+However, you can also pass logic to conditionally omit printing an error, choose colors, or provide a custom print function:
+
+```dart
+myLogger.onRecord.listen((record) => prettyLog(
+      record,
+      logColorChooser: (_) => red,
+      printFunction: stderr.writeln,
+      omitError: (r) {
+        var err = r.error;
+        return err is AngelHttpException && err.statusCode != 500;
+      },
+    ));
+```
diff --git a/common/pretty_logging/analysis_options.yaml b/common/pretty_logging/analysis_options.yaml
new file mode 100644
index 0000000..ea2c9e9
--- /dev/null
+++ b/common/pretty_logging/analysis_options.yaml
@@ -0,0 +1 @@
+include: package:lints/recommended.yaml
\ No newline at end of file
diff --git a/common/pretty_logging/example/main.dart b/common/pretty_logging/example/main.dart
new file mode 100644
index 0000000..f14a936
--- /dev/null
+++ b/common/pretty_logging/example/main.dart
@@ -0,0 +1,11 @@
+import 'package:logging/logging.dart';
+import 'package:platform_pretty_logging/pretty_logging.dart';
+
+void main() {
+  Logger.root
+    ..level = Level.ALL
+    ..onRecord.listen(prettyLog)
+    ..info('Hey!')
+    ..finest('Bye!')
+    ..severe('Oops!', StateError('Wrong!'), StackTrace.current);
+}
diff --git a/common/pretty_logging/lib/pretty_logging.dart b/common/pretty_logging/lib/pretty_logging.dart
new file mode 100644
index 0000000..0c2c75b
--- /dev/null
+++ b/common/pretty_logging/lib/pretty_logging.dart
@@ -0,0 +1,52 @@
+import 'package:logging/logging.dart';
+import 'package:io/ansi.dart';
+
+/// Prints the contents of a [LogRecord] with pretty colors.
+///
+/// By passing [omitError], you can omit printing the error of a given
+/// [LogRecord].
+///
+/// You can also pass a custom [printFunction] or [logColorChooser].
+void prettyLog(LogRecord record,
+    {bool Function(LogRecord)? omitError,
+    void Function(String)? printFunction,
+    AnsiCode Function(Level)? logColorChooser}) {
+  logColorChooser ??= chooseLogColor;
+  omitError ??= (_) => false;
+  printFunction ??= print;
+
+  var code = logColorChooser(record.level);
+  if (record.error == null) {
+    printFunction(code.wrap(record.toString()) ?? "");
+  }
+
+  if (record.error != null) {
+    var err = record.error;
+    if (omitError(record)) return;
+    printFunction(code.wrap('$record\n') ?? "");
+    printFunction(code.wrap(err.toString()) ?? "");
+
+    if (record.stackTrace != null) {
+      printFunction(code.wrap(record.stackTrace.toString()) ?? 
""); + } + } +} + +/// Chooses a color based on the logger [level]. +AnsiCode chooseLogColor(Level level) { + if (level == Level.SHOUT) { + return backgroundRed; + } else if (level == Level.SEVERE) { + return red; + } else if (level == Level.WARNING) { + return yellow; + } else if (level == Level.INFO) { + return cyan; + } else if (level == Level.CONFIG || + level == Level.FINE || + level == Level.FINER || + level == Level.FINEST) { + return lightGray; + } + return resetAll; +} diff --git a/common/pretty_logging/pubspec.lock b/common/pretty_logging/pubspec.lock new file mode 100644 index 0000000..4e94b31 --- /dev/null +++ b/common/pretty_logging/pubspec.lock @@ -0,0 +1,402 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: 
"https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: "direct main" + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: "direct main" + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + 
version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: 
"9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/pretty_logging/pubspec.yaml b/common/pretty_logging/pubspec.yaml new file mode 100644 index 0000000..57e6446 --- /dev/null +++ b/common/pretty_logging/pubspec.yaml @@ -0,0 +1,12 @@ +name: platform_pretty_logging +version: 6.2.0 +description: Standalone helper for colorful logging output, using pkg:io AnsiCode. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/pretty_logging +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + io: ^1.0.0 + logging: ^1.0.1 +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/pretty_logging/test/all_test.dart b/common/pretty_logging/test/all_test.dart new file mode 100644 index 0000000..917f186 --- /dev/null +++ b/common/pretty_logging/test/all_test.dart @@ -0,0 +1,8 @@ +import 'package:test/test.dart'; + +void main() { + test('test', () { + var message = "Testing"; + expect(message, equals('Testing')); + }); +} diff --git a/common/pub_sub/AUTHORS.md b/common/pub_sub/AUTHORS.md new file mode 100644 index 0000000..6ae218e --- /dev/null +++ b/common/pub_sub/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/pub_sub/CHANGELOG.md b/common/pub_sub/CHANGELOG.md new file mode 100644 index 0000000..e0cf010 --- /dev/null +++ b/common/pub_sub/CHANGELOG.md @@ -0,0 +1,74 @@ +# Change Log + +## 6.3.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 6.2.0 + +* Updated `lints` to 3.0.0 +* Refactored encode/decode message handling into `MessageHandler` + +## 6.1.0 + +* Updated `uuid` to 4.0.0 + +## 6.0.0 + +* Require Dart >= 3.0 + +## 6.0.0-beta.1 + +* Require Dart >= 3.0 + +## 5.0.0 + +* Require Dart >= 2.17 + +## 4.0.3 + +* Fixed license link + +## 4.0.2 + +* Updated README + +## 4.0.1 + +* Updated README + +## 4.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Published as `platform_pub_sub` package + +## 3.0.2 + +* Resolved static analysis warnings + +## 3.0.1 + +* Resolved static analysis warnings + +## 3.0.0 + +* Migrated to work with Dart SDK 2.12.x NNBD + +## 2.3.0 + +* Allow `2.x` versions of `stream_channel`. +* Apply `package:pedantic` lints. + +## 2.2.0 + +* Upgrade `uuid`. + +## 2.1.0 + +* Allow for "trusted clients," which are implicitly-registered clients. +This makes using `package:pub_sub` easier, as well making it easier to scale. + +## 2.0.0 + +* Dart 2 updates. diff --git a/common/pub_sub/LICENSE b/common/pub_sub/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/pub_sub/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. 
+ +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file diff --git a/common/pub_sub/README.md b/common/pub_sub/README.md new file mode 100644 index 0000000..19069b0 --- /dev/null +++ b/common/pub_sub/README.md @@ -0,0 +1,212 @@ +# Belatuk Pub Sub + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_pub_sub?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/pub_sub/LICENSE) + +**Replacement of `package:pub_sub` with breaking changes to support NNBD.** + +Keep application instances in sync with a simple pub/sub API. + +## Installation + +Add `platform_pub_sub` as a dependency in your `pubspec.yaml` file: + +```yaml +dependencies: + platform_pub_sub: ^6.2.0 +``` + +Then, be sure to run `dart pub get` in your terminal. + +## Usage + +`platform_pub_sub` is your typical pub/sub API. However, `platform_pub_sub` enforces authentication of every +request. It is very possible that `platform_pub_sub` will run on both server and in the browser, +or on a platform like Flutter. + +**Be careful to not leak any `platform_pub_sub` client ID's if operating over a network.** +If you do, you run the risk of malicious users injecting events into your application. + +A `platform_pub_sub` server can operate across multiple *adapters*, which take care of interfacing data over different media. For example, a single server can handle pub/sub between multiple Isolates and TCP Sockets, as well as WebSockets, simultaneously. + +```dart +import 'package:platform_pub_sub/platform_pub_sub.dart' as pub_sub; + +main() async { + var server = pub_sub.Server([ + FooAdapter(...), + BarAdapter(...) + ]); + + server.addAdapter( BazAdapter(...)); + + // Call `start` to activate adapters, and begin handling requests. 
+ server.start(); +} +``` + +### Trusted Clients + +You can use `package:platform_pub_sub` without explicitly registering clients, *if and only if* those clients come from trusted sources. Clients via `Isolate` are always trusted. Clients via `package:json_rpc_2` must be explicitly marked as trusted (i.e. using an IP whitelist mechanism): + +```dart +JsonRpc2Adapter(..., isTrusted: false); + +// Pass `null` as Client ID when trusted... +pub_sub.IsolateClient(null); +``` + +### Access Control + +The ID's of all *untrusted* clients who will connect to the server must be known at start-up time. +You may not register new clients after the server has started. This is mostly a security consideration, to make it impossible to register new clients, thus preventing malicious users from granting themselves additional privileges within the system. + +```dart +import 'package:platform_pub_sub/platform_pub_sub.dart' as pub_sub; + +void main() async { + // ... + server.registerClient(const ClientInfo('')); + + // Create a user who can subscribe, but not publish. + server.registerClient(const ClientInfo('', canPublish: false)); + + // Create a user who can publish, but not subscribe. + server.registerClient(const ClientInfo('', canSubscribe: false)); + + // Create a user with no privileges whatsoever. + server.registerClient(const ClientInfo('', canPublish: false, canSubscribe: false)); + + server.start(); +} +``` + +### Isolates + +If you are just running multiple instances of a server, use `package:platform_pub_sub/isolate.dart`. You'll need one isolate to be the master. Typically this is the first isolate you create. + +```dart +import 'dart:io'; +import 'dart:isolate'; +import 'package:platform_pub_sub/isolate.dart' as pub_sub; +import 'package:platform_pub_sub/platform_pub_sub.dart' as pub_sub; + +void main() async { + // Easily bring up a server. + var adapter = pub_sub.IsolateAdapter(); + var server = pub_sub.Server([adapter]); + + // You then need to create a client that will connect to the adapter. + // Each isolate in your application should contain a client. + for (int i = 0; i < Platform.numberOfProcessors - 1; i++) { + server.registerClient(pub_sub.ClientInfo('client$i')); + } + + // Start the server. + server.start(); + + // Next, let's start isolates that interact with the server. + // + // Fortunately, we can send SendPorts over Isolates, so this is no hassle. + for (int i = 0; i < Platform.numberOfProcessors - 1; i++) + Isolate.spawn(isolateMain, [i, adapter.receivePort.sendPort]); + + // It's possible that you're running your application in the server isolate as well: + isolateMain([0, adapter.receivePort.sendPort]); +} + +void isolateMain(List args) { + var client = + pub_sub.IsolateClient('client${args[0]}', args[1] as SendPort); + + // The client will connect automatically. In the meantime, we can start subscribing to events. + client.subscribe('user::logged_in').then((sub) { + // The `ClientSubscription` class extends `Stream`. Hooray for asynchrony! + sub.listen((msg) { + print('Logged in: $msg'); + }); + }); +} + +``` + +### JSON RPC 2.0 + +If you are not running on isolates, you need to import +`package:platform_pub_sub/json_rpc_2.dart`. This library leverages `package:json_rpc_2` and +`package:stream_channel` to create clients and servers that can hypothetically run on any +medium, i.e. WebSockets, or TCP Sockets. + +Check out `test/json_rpc_2_test.dart` for an example of serving `platform_pub_sub` over TCP sockets. 
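+
+Below is a minimal sketch of that setup, adapted from the test. It assumes the `streamSocket` helper defined in `test/json_rpc_2_test.dart` (which wraps a `Socket` in a `StreamChannel<String>`); the `example::client` ID is purely illustrative.
+
+```dart
+import 'dart:io';
+
+import 'package:platform_pub_sub/json_rpc_2.dart';
+import 'package:platform_pub_sub/pub_sub.dart';
+
+void main() async {
+  // Accept TCP clients; each socket is adapted into a StreamChannel<String>.
+  var serverSocket = await ServerSocket.bind(InternetAddress.loopbackIPv4, 0);
+  var adapter =
+      JsonRpc2Adapter(serverSocket.map(streamSocket), isTrusted: false);
+
+  var server = Server([adapter])
+    ..registerClient(const ClientInfo('example::client'))
+    ..start();
+
+  // A client connects to the same port and authenticates with its ID.
+  var socket =
+      await Socket.connect(InternetAddress.loopbackIPv4, serverSocket.port);
+  var client = JsonRpc2Client('example::client', streamSocket(socket));
+  await client.publish('greeting', 'hello world');
+}
+```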
+ +## Protocol + +`platform_pub_sub` is built upon a simple RPC, and this package includes an implementation that runs via `SendPort`s and `ReceivePort`s, as well as one that runs on any `StreamChannel`. + +Data sent over the wire looks like the following: + +```typescript +// Sent by a client to initiate an exchange. +interface Request { + // This is an arbitrary string, assigned by your client, but in every case, + // the client uses this to match your requests with asynchronous responses. + request_id: string, + + // The ID of the client to authenticate as. + // + // As you can imagine, this should be kept secret, to prevent breaches. + client_id: string, + + // Required for *every* request. + params: { + // A value to be `publish`ed. + value?: any, + + // The name of an event to `publish`. + event_name?: string, + + // The ID of a subscription to be cancelled. + subscription_id?: string + } +} + +/// Sent by the server in response to a request. +interface Response { + // `true` for success, `false` for failures. + status: boolean, + + // Only appears if `status` is `false`; explains why an operation failed. + error_message?: string, + + // Matches the request_id sent by the client. + request_id: string, + + result?: { + // The number of other clients to whom an event was `publish`ed. + listeners:? number, + + // The ID of a created subscription. + subscription_id?: string + } +} +``` + +When sending via JSON_RPC 2.0, the `params` of a `Request` are simply folded into the object +itself, for simplicity's sake. In this case, a response will be sent as a notification whose +name is the `request_id`. + +In the case of Isolate clients/servers, events will be simply sent as Lists: + +```dart +['', value] +``` + +Clients can send with the following 3 methods: + +* `subscribe` (`event_name`:string): Subscribe to an event. +* `unsubscribe` (`subscription_id`:string): Unsubscribe from an event you previously subscribed to. +* `publish` (`event_name`:string, `value`:any): Publish an event to all other clients who are subscribed. + +The client and server in `package:platform_pub_sub/isolate.dart` must make extra provisions to keep track of client ID's. Since `SendPort`s and `ReceivePort`s do not have any sort of guaranteed-unique ID's, new clients must send their `SendPort` to the server before sending any requests. The server then responds +with an `id` that must be used to identify a `SendPort` to send a response to. diff --git a/common/pub_sub/analysis_options.yaml b/common/pub_sub/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/pub_sub/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/pub_sub/example/main.dart b/common/pub_sub/example/main.dart new file mode 100644 index 0000000..58e0fbb --- /dev/null +++ b/common/pub_sub/example/main.dart @@ -0,0 +1,46 @@ +import 'dart:io'; +import 'dart:isolate'; +import 'package:platform_pub_sub/isolate.dart'; +import 'package:platform_pub_sub/pub_sub.dart'; + +void main() async { + // Easily bring up a server. + var adapter = IsolateAdapter(); + var server = Server([adapter]); + + // You then need to create a client that will connect to the adapter. + // Every untrusted client in your application should be pre-registered. + // + // In the case of Isolates, however, those are always implicitly trusted. 
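+  // Because isolate clients are trusted, they may also connect with a `null`
+  // client ID and be assigned an ad-hoc ID by the server; registering
+  // explicit IDs here just keeps each worker's identity stable.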
+ print("Register Client"); + for (var i = 0; i < Platform.numberOfProcessors - 1; i++) { + server.registerClient(ClientInfo('client$i')); + } + + // Start the server. + server.start(); + + // Next, let's start isolates that interact with the server. + // + // Fortunately, we can send SendPorts over Isolates, so this is no hassle. + print("Create Isolate"); + for (var i = 0; i < Platform.numberOfProcessors - 1; i++) { + await Isolate.spawn(isolateMain, [i, adapter.receivePort.sendPort]); + } + + // It's possible that you're running your application in the server isolate as well: + isolateMain([0, adapter.receivePort.sendPort]); +} + +void isolateMain(List args) { + // Isolates are always trusted, so technically we don't need to pass a client iD. + var client = IsolateClient('client${args[0]}', args[1] as SendPort); + + // The client will connect automatically. In the meantime, we can start subscribing to events. + client.subscribe('user::logged_in').then((sub) { + // The `ClientSubscription` class extends `Stream`. Hooray for asynchrony! + sub.listen((msg) { + print('Logged in: $msg'); + }); + }); +} diff --git a/common/pub_sub/lib/isolate.dart b/common/pub_sub/lib/isolate.dart new file mode 100644 index 0000000..0fcf44b --- /dev/null +++ b/common/pub_sub/lib/isolate.dart @@ -0,0 +1,2 @@ +export 'src/isolate/client.dart'; +export 'src/isolate/server.dart'; diff --git a/common/pub_sub/lib/json_rpc_2.dart b/common/pub_sub/lib/json_rpc_2.dart new file mode 100644 index 0000000..41bd3b0 --- /dev/null +++ b/common/pub_sub/lib/json_rpc_2.dart @@ -0,0 +1,2 @@ +export 'src/json_rpc/client.dart'; +export 'src/json_rpc/server.dart'; diff --git a/common/pub_sub/lib/pub_sub.dart b/common/pub_sub/lib/pub_sub.dart new file mode 100644 index 0000000..8450547 --- /dev/null +++ b/common/pub_sub/lib/pub_sub.dart @@ -0,0 +1 @@ +export 'src/protocol/protocol.dart'; diff --git a/common/pub_sub/lib/src/isolate/client.dart b/common/pub_sub/lib/src/isolate/client.dart new file mode 100644 index 0000000..cd1ce29 --- /dev/null +++ b/common/pub_sub/lib/src/isolate/client.dart @@ -0,0 +1,184 @@ +import 'dart:async'; +import 'dart:collection'; +import 'dart:isolate'; +import 'package:uuid/uuid.dart'; + +import '../protocol/protocol.dart'; +import 'shared.dart'; + +/// A [Client] implementation that communicates via [SendPort]s and [ReceivePort]s. +class IsolateClient extends Client { + final Queue> _onConnect = Queue>(); + final Map> _requests = {}; + final List<_IsolateClientSubscription> _subscriptions = []; + final Uuid _uuid = Uuid(); + + String? _id; + + /// The ID of the client we are authenticating as. + /// + /// May be `null`, if and only if we are marked as a trusted source on + /// the server side. + String? get clientId => _clientId; + String? _clientId; + + /// A server's [SendPort] that messages should be sent to. + final SendPort serverSendPort; + + /// A [ReceivePort] that receives messages from the server. + final ReceivePort receivePort = ReceivePort(); + + IsolateClient(String? clientId, this.serverSendPort) { + _clientId = clientId; + receivePort.listen((data) { + if (data is Map) { + var (status, id, requestId, result, errorMessage) = + MessageHandler().decodeResponseMessage(data); + + if (requestId != null) { + //var requestId = data['request_id'] as String?; + var c = _requests.remove(requestId); + + if (c != null && !c.isCompleted) { + //if (data['status'] is! 
bool) { + // c.completeError( + // FormatException('The server sent an invalid response.')); + //} else if (!(data['status'] as bool)) { + if (!status) { + c.completeError(PubSubException(errorMessage ?? + 'The server sent a failure response, but did not provide an error message.')); + } else if (result is! Map) { + c.completeError(FormatException( + 'The server sent a success response, but did not include a result.')); + } else { + c.complete(result); + } + } + } else if (id != null && _id == null) { + _id = id; + + for (var c in _onConnect) { + if (!c.isCompleted) c.complete(_id); + } + + _onConnect.clear(); + } + } else if (data is List) { + var eventName = data[0] as String; + var event = data[1]; + for (var s in _subscriptions.where((s) => s.eventName == eventName)) { + if (!s._stream.isClosed) s._stream.add(event); + } + } + }); + serverSendPort.send(receivePort.sendPort); + } + + Future _whenConnected(FutureOr Function() callback) { + if (_id != null) { + return Future.sync(callback); + } else { + var c = Completer(); + _onConnect.add(c); + return c.future.then((_) => callback()); + } + } + + @override + Future publish(String eventName, value) { + return _whenConnected(() { + var c = Completer(); + var requestId = _uuid.v4(); + _requests[requestId] = c; + serverSendPort.send(MessageHandler().encodePublishRequestMessage( + _id, requestId, clientId, eventName, value)); + + return c.future.then((result) { + var (_, clientId) = MessageHandler() + .decodePublishResponseMessage(result as Map); + _clientId = clientId; + }); + }); + } + + @override + Future subscribe(String eventName) { + return _whenConnected(() { + var c = Completer(); + var requestId = _uuid.v4(); + _requests[requestId] = c; + serverSendPort.send(MessageHandler().encodeSubscriptionRequestMessage( + _id, requestId, clientId, eventName)); + + return c.future.then((result) { + var (subcriptionId, clientId) = MessageHandler() + .decodeSubscriptionResponseMessage(result as Map); + _clientId = clientId; + var s = _IsolateClientSubscription(eventName, subcriptionId, this); + _subscriptions.add(s); + return s; + }); + }); + } + + @override + Future close() { + receivePort.close(); + + for (var c in _onConnect) { + if (!c.isCompleted) { + c.completeError(StateError( + 'The client was closed before the server ever accepted the connection.')); + } + } + + for (var c in _requests.values) { + if (!c.isCompleted) { + c.completeError(StateError( + 'The client was closed before the server responded to this request.')); + } + } + + for (var s in _subscriptions) { + s._close(); + } + + _requests.clear(); + return Future.value(); + } +} + +class _IsolateClientSubscription extends ClientSubscription { + final StreamController _stream = StreamController(); + final String? eventName, id; + final IsolateClient client; + + _IsolateClientSubscription(this.eventName, this.id, this.client); + + void _close() { + if (!_stream.isClosed) _stream.close(); + } + + @override + StreamSubscription listen(void Function(dynamic event)? onData, + {Function? onError, void Function()? onDone, bool? 
cancelOnError}) { + return _stream.stream.listen(onData, + onError: onError, onDone: onDone, cancelOnError: cancelOnError); + } + + @override + Future unsubscribe() { + return client._whenConnected(() { + var c = Completer(); + var requestId = client._uuid.v4(); + client._requests[requestId] = c; + client.serverSendPort.send(MessageHandler() + .encodeUnsubscriptionRequestMessage( + client._id, requestId, client.clientId, id)); + + return c.future.then((result) { + _close(); + }); + }); + } +} diff --git a/common/pub_sub/lib/src/isolate/server.dart b/common/pub_sub/lib/src/isolate/server.dart new file mode 100644 index 0000000..95c9f79 --- /dev/null +++ b/common/pub_sub/lib/src/isolate/server.dart @@ -0,0 +1,206 @@ +import 'dart:async'; +import 'dart:isolate'; +import 'package:uuid/uuid.dart'; +import '../protocol/protocol.dart'; +import 'shared.dart'; + +/// A [Adapter] implementation that communicates via [SendPort]s and [ReceivePort]s. +class IsolateAdapter extends Adapter { + final Map _clients = {}; + final StreamController _onPublish = + StreamController(); + final StreamController _onSubscribe = + StreamController(); + final StreamController _onUnsubscribe = + StreamController(); + final Uuid _uuid = Uuid(); + + /// A [ReceivePort] on which to listen for incoming data. + final ReceivePort receivePort = ReceivePort(); + + @override + Stream get onPublish => _onPublish.stream; + + @override + Stream get onSubscribe => _onSubscribe.stream; + + @override + Stream get onUnsubscribe => _onUnsubscribe.stream; + + @override + Future close() { + receivePort.close(); + _clients.clear(); + _onPublish.close(); + _onSubscribe.close(); + _onUnsubscribe.close(); + return Future.value(); + } + + @override + void start() { + receivePort.listen((data) { + if (data is SendPort) { + var id = _uuid.v4(); + _clients[id] = data; + data.send(MessageHandler().encodeSendPortResponseMessage(id)); + } else if (data is Map) { + var (id, method, requestId, params) = + MessageHandler().decodeRequestMessage(data); + var (clientId, eventName, subscriptionId, value) = + MessageHandler().decodeRequestParams(params); + + var sp = _clients[id]; + if (sp == null) { + // There's nobody to respond to, so don't send anything to anyone + return; + } + + if (method == 'publish') { + if (eventName == null || value == null) { + sp.send(MessageHandler().encodePublishResponseError(requestId)); + } + var rq = _IsolatePublishRequestImpl( + requestId, clientId, eventName, value, sp); + _onPublish.add(rq); + } else if (method == 'subscribe') { + if (eventName == null) { + sp.send( + MessageHandler().encodeSubscriptionResponseError(requestId)); + } + var rq = _IsolateSubscriptionRequestImpl( + clientId, eventName, sp, requestId, _uuid); + _onSubscribe.add(rq); + } else if (method == 'unsubscribe') { + if (subscriptionId == null) { + sp.send( + MessageHandler().encodeUnsubscriptionResponseError(requestId)); + } + var rq = _IsolateUnsubscriptionRequestImpl( + clientId, subscriptionId, sp, requestId); + _onUnsubscribe.add(rq); + } else { + sp.send(MessageHandler() + .encodeUnknownMethodResponseError(requestId, method)); + } + } + }); + } + + @override + bool isTrustedPublishRequest(PublishRequest request) { + // Isolate clients are considered trusted, because they are + // running in the same process as the central server. + return true; + } + + @override + bool isTrustedSubscriptionRequest(SubscriptionRequest request) { + return true; + } +} + +class _IsolatePublishRequestImpl extends PublishRequest { + @override + final String? 
clientId; + + @override + final String? eventName; + + @override + final Object? value; + + final SendPort sendPort; + + final String? requestId; + + _IsolatePublishRequestImpl( + this.requestId, this.clientId, this.eventName, this.value, this.sendPort); + + @override + void reject(String errorMessage) { + sendPort.send(MessageHandler() + .encodePublishResponseError(requestId, errorMessage: errorMessage)); + } + + @override + void accept(PublishResponse response) { + sendPort.send(MessageHandler().encodePublishResponseMessage2( + requestId, response.listeners, response.clientId)); + } +} + +class _IsolateSubscriptionRequestImpl extends SubscriptionRequest { + @override + final String? clientId; + + @override + final String? eventName; + + final SendPort sendPort; + + final String? requestId; + + final Uuid _uuid; + + _IsolateSubscriptionRequestImpl( + this.clientId, this.eventName, this.sendPort, this.requestId, this._uuid); + + @override + void reject(String errorMessage) { + sendPort.send(MessageHandler().encodeSubscriptionResponseError(requestId, + errorMessage: errorMessage)); + } + + @override + FutureOr accept(String? clientId) { + var id = _uuid.v4(); + sendPort.send(MessageHandler() + .encodeSubscriptionResponseMessage(requestId, id, clientId)); + return _IsolateSubscriptionImpl(clientId, id, eventName, sendPort); + } +} + +class _IsolateSubscriptionImpl extends Subscription { + @override + final String? clientId, id; + + final String? eventName; + + final SendPort sendPort; + + _IsolateSubscriptionImpl( + this.clientId, this.id, this.eventName, this.sendPort); + + @override + void dispatch(event) { + sendPort.send([eventName, event]); + } +} + +class _IsolateUnsubscriptionRequestImpl extends UnsubscriptionRequest { + @override + final String? clientId; + + @override + final String? subscriptionId; + + final SendPort sendPort; + + final String? requestId; + + _IsolateUnsubscriptionRequestImpl( + this.clientId, this.subscriptionId, this.sendPort, this.requestId); + + @override + void reject(String errorMessage) { + sendPort.send(MessageHandler().encodeUnsubscriptionResponseError(requestId, + errorMessage: errorMessage)); + } + + @override + void accept() { + sendPort + .send(MessageHandler().encodeUnsubscriptionResponseMessage(requestId)); + } +} diff --git a/common/pub_sub/lib/src/isolate/shared.dart b/common/pub_sub/lib/src/isolate/shared.dart new file mode 100644 index 0000000..9751b81 --- /dev/null +++ b/common/pub_sub/lib/src/isolate/shared.dart @@ -0,0 +1,183 @@ +/// A message handler class that handles the encoding/decoding of messages send +/// between isolate [Client] and [Server]. +class MessageHandler { + static const _requestId = 'request_id'; + static const _method = 'method'; + static const _clientId = 'client_id'; + static const _eventName = 'event_name'; + static const _subscriptionId = 'subscription_id'; + static const _errorMessage = 'error_message'; + static const _value = 'value'; + static const _id = 'id'; + static const _params = 'params'; + static const _status = 'status'; + static const _result = 'result'; + static const _listeners = 'listeners'; + + static const _publishErrorMsg = 'Expected client_id, event_name, and value'; + static const _subscribeErrorMsg = 'Expected client_id, and event_name'; + static const _unsubscribeErrorMsg = 'Expected client_id, and subscription_id'; + + const MessageHandler(); + + Map encodePublishResponseError(String? 
requestId, + {String errorMessage = _publishErrorMsg}) { + return _encodeResponseError(requestId, errorMessage); + } + + Map encodeSubscriptionResponseError(String? requestId, + {String errorMessage = _subscribeErrorMsg}) { + return _encodeResponseError(requestId, errorMessage); + } + + Map encodeUnsubscriptionResponseError(String? requestId, + {String errorMessage = _unsubscribeErrorMsg}) { + return _encodeResponseError(requestId, errorMessage); + } + + Map encodeUnknownMethodResponseError( + String? requestId, String method) { + var unknownMethodErrorMsg = + 'Unrecognized method "$method" or you have omitted id, request_id, method, or params'; + + return _encodeResponseError(requestId, unknownMethodErrorMsg); + } + + Map _encodeResponseError(String? requestId, String message) { + return { + _status: false, + _requestId: requestId ?? '', + _errorMessage: message + }; + } + + Map encodeEventMessage(String? requestId, Object message) { + return {_status: true, _requestId: requestId ?? '', _result: message}; + } + + Map encodeSubscriptionResponseMessage( + String? requestId, String? subscriptionId, String? clientId) { + return { + _status: true, + _requestId: requestId ?? '', + _result: {_subscriptionId: subscriptionId, _clientId: clientId} + }; + } + + (String?, String?) decodeSubscriptionResponseMessage( + Map message) { + var subscriptionId = message[_subscriptionId] as String?; + var clientId = message[_clientId] as String?; + + return (subscriptionId, clientId); + } + + Map encodeUnsubscriptionResponseMessage(String? requestId) { + return {_status: true, _requestId: requestId, _result: {}}; + } + + (bool, String?, Object?, String?) decodeUnsubscriptionResponseMessage( + Map message) { + var status = message[_status] as bool? ?? false; + var requestId = message[_requestId] as String?; + var result = message[_result]; + var errorMessage = message[_errorMessage] as String?; + + return (status, requestId, result, errorMessage); + } + + Map encodePublishResponseMessage2( + String? requestId, int listeners, String? clientId) { + return { + _status: true, + _requestId: requestId, + _result: {_listeners: listeners, _clientId: clientId} + }; + } + + (int, String?) decodePublishResponseMessage(Map message) { + var listeners = message[_listeners] as int; + var clientId = message[_clientId] as String?; + + return (listeners, clientId); + } + + Map encodePublishResponseMessage(String? id, + String? requestId, String? clientId, String? eventName, Object? value) { + return { + _id: id, + _requestId: requestId, + _method: 'publish', + _params: {_clientId: clientId, _eventName: eventName, _value: value} + }; + } + + Map encodeResponseMessage( + String? requestId, Object message) { + return {_status: true, _requestId: requestId ?? '', _result: message}; + } + + (bool, String?, String?, Object?, String?) decodeResponseMessage( + Map message) { + var id = message[_id] as String?; + var status = message[_status] as bool? ?? false; + var requestId = message[_requestId] as String?; + var result = message[_result]; + var errorMessage = message[_errorMessage] as String?; + + return (status, id, requestId, result, errorMessage); + } + + (String, String, String, Map) decodeRequestMessage( + Map message) { + var id = message[_id] as String? ?? ''; + var method = message[_method] as String? ?? ''; + var requestId = message[_requestId] as String? ?? ''; + var params = message[_params] as Map? ?? {}; + + return (id, method, requestId, params); + } + + Map encodeSubscriptionRequestMessage( + String? id, String? 
requestId, String? clientId, String? eventName) { + return { + _id: id, + _requestId: requestId, + _method: 'subscribe', + _params: {_clientId: clientId, _eventName: eventName} + }; + } + + Map encodeUnsubscriptionRequestMessage( + String? id, String? requestId, String? clientId, String? subscriptionId) { + return { + _id: id, + _requestId: requestId, + _method: 'unsubscribe', + _params: {_clientId: clientId, _subscriptionId: subscriptionId} + }; + } + + Map encodePublishRequestMessage(String? id, + String? requestId, String? clientId, String? eventName, Object? value) { + return { + _id: id, + _requestId: requestId, + _method: 'publish', + _params: {_clientId: clientId, _eventName: eventName, _value: value} + }; + } + + (String?, String?, String?, Object?) decodeRequestParams( + Map params) { + var clientId = params[_clientId] as String?; + var eventName = params[_eventName] as String?; + var value = params[_value]; + var subscriptionId = params[_subscriptionId] as String?; + return (clientId, eventName, subscriptionId, value); + } + + Map encodeSendPortResponseMessage(String id) { + return {_status: true, _id: id}; + } +} diff --git a/common/pub_sub/lib/src/json_rpc/client.dart b/common/pub_sub/lib/src/json_rpc/client.dart new file mode 100644 index 0000000..9dab0ff --- /dev/null +++ b/common/pub_sub/lib/src/json_rpc/client.dart @@ -0,0 +1,145 @@ +import 'dart:async'; +import 'package:stream_channel/stream_channel.dart'; +import 'package:json_rpc_2/json_rpc_2.dart' as json_rpc_2; +import 'package:uuid/uuid.dart'; +import '../../pub_sub.dart'; + +/// A [Client] implementation that communicates via JSON RPC 2.0. +class JsonRpc2Client extends Client { + final Map> _requests = {}; + final List<_JsonRpc2ClientSubscription> _subscriptions = []; + final Uuid _uuid = Uuid(); + + json_rpc_2.Peer? _peer; + + /// The ID of the client we are authenticating as. + /// + /// May be `null`, if and only if we are marked as a trusted source on + /// the server side. + String? get clientId => _clientId; + String? _clientId; + + JsonRpc2Client(String? clientId, StreamChannel channel) { + _clientId = clientId; + _peer = json_rpc_2.Peer(channel); + + _peer!.registerMethod('event', (json_rpc_2.Parameters params) { + String? eventName = params['event_name'].asString; + var event = params['value'].value; + for (var s in _subscriptions.where((s) => s.eventName == eventName)) { + if (!s._stream.isClosed) s._stream.add(event); + } + }); + + _peer!.registerFallback((json_rpc_2.Parameters params) { + var c = _requests.remove(params.method); + + if (c == null) { + throw json_rpc_2.RpcException.methodNotFound(params.method); + } else { + var data = params.asMap; + + if (data['status'] is! bool) { + c.completeError( + FormatException('The server sent an invalid response.')); + } else if (!(data['status'] as bool)) { + c.completeError(PubSubException(data['error_message']?.toString() ?? 
+ 'The server sent a failure response, but did not provide an error message.')); + } else { + c.complete(data); + } + } + }); + + _peer!.listen(); + } + + @override + Future publish(String eventName, value) { + var c = Completer(); + var requestId = _uuid.v4(); + _requests[requestId] = c; + _peer!.sendNotification('publish', { + 'request_id': requestId, + 'client_id': clientId, + 'event_name': eventName, + 'value': value + }); + return c.future.then((data) { + _clientId = data['result']['client_id'] as String?; + }); + } + + @override + Future subscribe(String eventName) { + var c = Completer(); + var requestId = _uuid.v4(); + _requests[requestId] = c; + _peer!.sendNotification('subscribe', { + 'request_id': requestId, + 'client_id': clientId, + 'event_name': eventName + }); + return c.future.then((result) { + _clientId = result['client_id'] as String?; + var s = _JsonRpc2ClientSubscription( + eventName, result['subscription_id'] as String?, this); + _subscriptions.add(s); + return s; + }); + } + + @override + Future close() { + if (_peer?.isClosed != true) _peer!.close(); + + for (var c in _requests.values) { + if (!c.isCompleted) { + c.completeError(StateError( + 'The client was closed before the server responded to this request.')); + } + } + + for (var s in _subscriptions) { + s._close(); + } + + _requests.clear(); + return Future.value(); + } +} + +class _JsonRpc2ClientSubscription extends ClientSubscription { + final StreamController _stream = StreamController(); + final String? eventName, id; + final JsonRpc2Client client; + + _JsonRpc2ClientSubscription(this.eventName, this.id, this.client); + + void _close() { + if (!_stream.isClosed) _stream.close(); + } + + @override + StreamSubscription listen(void Function(dynamic event)? onData, + {Function? onError, void Function()? onDone, bool? cancelOnError}) { + return _stream.stream.listen(onData, + onError: onError, onDone: onDone, cancelOnError: cancelOnError); + } + + @override + Future unsubscribe() { + var c = Completer(); + var requestId = client._uuid.v4(); + client._requests[requestId] = c; + client._peer!.sendNotification('unsubscribe', { + 'request_id': requestId, + 'client_id': client.clientId, + 'subscription_id': id + }); + + return c.future.then((_) { + _close(); + }); + } +} diff --git a/common/pub_sub/lib/src/json_rpc/server.dart b/common/pub_sub/lib/src/json_rpc/server.dart new file mode 100644 index 0000000..70391c0 --- /dev/null +++ b/common/pub_sub/lib/src/json_rpc/server.dart @@ -0,0 +1,220 @@ +import 'dart:async'; +import 'package:stream_channel/stream_channel.dart'; +import 'package:json_rpc_2/json_rpc_2.dart' as json_rpc_2; +import 'package:uuid/uuid.dart'; +import '../../pub_sub.dart'; + +/// A [Adapter] implementation that communicates via JSON RPC 2.0. +class JsonRpc2Adapter extends Adapter { + final StreamController _onPublish = + StreamController(); + final StreamController _onSubscribe = + StreamController(); + final StreamController _onUnsubscribe = + StreamController(); + + final List _peers = []; + final Uuid _uuid = Uuid(); + + json_rpc_2.Peer? _peer; + + /// A [Stream] of incoming clients, who can both send and receive string data. + final Stream> clientStream; + + /// If `true`, clients can connect through this endpoint, *without* providing a client ID. + /// + /// This can be a security vulnerability if you don't know what you're doing. + /// If you *must* use this over the Internet, use an IP whitelist. 
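+  ///
+  /// When `false`, every request must supply a `client_id` that was
+  /// pre-registered on the [Server] via `registerClient`.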
+ final bool isTrusted; + + JsonRpc2Adapter(this.clientStream, {this.isTrusted = false}); + + @override + Stream get onPublish => _onPublish.stream; + + @override + Stream get onSubscribe => _onSubscribe.stream; + + @override + Stream get onUnsubscribe => _onUnsubscribe.stream; + + @override + Future close() { + if (_peer?.isClosed != true) _peer?.close(); + + Future.wait(_peers.where((s) => !s.isClosed).map((s) => s.close())) + .then((_) => _peers.clear()); + return Future.value(); + } + + String? _getClientId(json_rpc_2.Parameters params) { + try { + return params['client_id'].asString; + } catch (_) { + return null; + } + } + + @override + void start() { + clientStream.listen((client) { + var peer = _peer = json_rpc_2.Peer(client); + + peer.registerMethod('publish', (json_rpc_2.Parameters params) async { + var requestId = params['request_id'].asString; + var clientId = _getClientId(params); + var eventName = params['event_name'].asString; + var value = params['value'].value; + var rq = _JsonRpc2PublishRequestImpl( + requestId, clientId, eventName, value, peer); + _onPublish.add(rq); + }); + + peer.registerMethod('subscribe', (json_rpc_2.Parameters params) async { + var requestId = params['request_id'].asString; + var clientId = _getClientId(params); + var eventName = params['event_name'].asString; + var rq = _JsonRpc2SubscriptionRequestImpl( + clientId, eventName, requestId, peer, _uuid); + _onSubscribe.add(rq); + }); + + peer.registerMethod('unsubscribe', (json_rpc_2.Parameters params) async { + var requestId = params['request_id'].asString; + var clientId = _getClientId(params); + var subscriptionId = params['subscription_id'].asString; + var rq = _JsonRpc2UnsubscriptionRequestImpl( + clientId, subscriptionId, peer, requestId); + _onUnsubscribe.add(rq); + }); + + peer.listen(); + }); + } + + @override + bool isTrustedPublishRequest(PublishRequest request) { + return isTrusted; + } + + @override + bool isTrustedSubscriptionRequest(SubscriptionRequest request) { + return isTrusted; + } +} + +class _JsonRpc2PublishRequestImpl extends PublishRequest { + final String requestId; + + @override + final String? clientId, eventName; + + @override + final dynamic value; + + final json_rpc_2.Peer peer; + + _JsonRpc2PublishRequestImpl( + this.requestId, this.clientId, this.eventName, this.value, this.peer); + + @override + void accept(PublishResponse response) { + peer.sendNotification(requestId, { + 'status': true, + 'request_id': requestId, + 'result': { + 'listeners': response.listeners, + 'client_id': response.clientId + } + }); + } + + @override + void reject(String errorMessage) { + peer.sendNotification(requestId, { + 'status': false, + 'request_id': requestId, + 'error_message': errorMessage + }); + } +} + +class _JsonRpc2SubscriptionRequestImpl extends SubscriptionRequest { + @override + final String? clientId, eventName; + + final String requestId; + + final json_rpc_2.Peer peer; + + final Uuid _uuid; + + _JsonRpc2SubscriptionRequestImpl( + this.clientId, this.eventName, this.requestId, this.peer, this._uuid); + + @override + FutureOr accept(String? 
clientId) { + var id = _uuid.v4(); + peer.sendNotification(requestId, { + 'status': true, + 'request_id': requestId, + 'subscription_id': id, + 'client_id': clientId + }); + return _JsonRpc2SubscriptionImpl(clientId, id, eventName, peer); + } + + @override + void reject(String errorMessage) { + peer.sendNotification(requestId, { + 'status': false, + 'request_id': requestId, + 'error_message': errorMessage + }); + } +} + +class _JsonRpc2SubscriptionImpl extends Subscription { + @override + final String? clientId, id; + + final String? eventName; + + final json_rpc_2.Peer peer; + + _JsonRpc2SubscriptionImpl(this.clientId, this.id, this.eventName, this.peer); + + @override + void dispatch(event) { + peer.sendNotification('event', {'event_name': eventName, 'value': event}); + } +} + +class _JsonRpc2UnsubscriptionRequestImpl extends UnsubscriptionRequest { + @override + final String? clientId; + + @override + final String subscriptionId; + + final json_rpc_2.Peer peer; + + final String requestId; + + _JsonRpc2UnsubscriptionRequestImpl( + this.clientId, this.subscriptionId, this.peer, this.requestId); + + @override + void accept() { + peer.sendNotification(requestId, {'status': true, 'result': {}}); + } + + @override + void reject(String errorMessage) { + peer.sendNotification(requestId, { + 'status': false, + 'request_id': requestId, + 'error_message': errorMessage + }); + } +} diff --git a/common/pub_sub/lib/src/protocol/client/client.dart b/common/pub_sub/lib/src/protocol/client/client.dart new file mode 100644 index 0000000..4a32716 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/client/client.dart @@ -0,0 +1,30 @@ +import 'dart:async'; + +/// Queries a `pub_sub` server. +abstract class Client { + /// Publishes an event to the server. + Future publish(String eventName, value); + + /// Request a [ClientSubscription] to the desired [eventName] from the server. + Future subscribe(String eventName); + + /// Disposes of this client. + Future close(); +} + +/// A client-side implementation of a subscription, which acts as a [Stream], and can be cancelled easily. +abstract class ClientSubscription extends Stream { + /// Stops listening for new events, and instructs the server to cancel the subscription. + Future unsubscribe(); +} + +/// Thrown as the result of an invalid request, or an attempt to perform an action without the correct privileges. +class PubSubException implements Exception { + /// The error message sent by the server. 
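+  ///
+  /// Typical causes are publishing without the `canPublish` privilege,
+  /// subscribing without `canSubscribe`, or presenting an unrecognized
+  /// client ID.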
+ final String message; + + const PubSubException(this.message); + + @override + String toString() => '`pub_sub` exception: $message'; +} diff --git a/common/pub_sub/lib/src/protocol/client/sync_client.dart b/common/pub_sub/lib/src/protocol/client/sync_client.dart new file mode 100644 index 0000000..93a3257 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/client/sync_client.dart @@ -0,0 +1 @@ +export 'client.dart'; diff --git a/common/pub_sub/lib/src/protocol/protocol.dart b/common/pub_sub/lib/src/protocol/protocol.dart new file mode 100644 index 0000000..9bf74c6 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/protocol.dart @@ -0,0 +1,2 @@ +export 'client/sync_client.dart'; +export 'server/sync_server.dart'; diff --git a/common/pub_sub/lib/src/protocol/server/adapter.dart b/common/pub_sub/lib/src/protocol/server/adapter.dart new file mode 100644 index 0000000..e129b4a --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/adapter.dart @@ -0,0 +1,29 @@ +import 'dart:async'; +import 'publish.dart'; +import 'subscription.dart'; + +/// Adapts an abstract medium to serve the `pub_sub` RPC protocol. +abstract class Adapter { + /// Determines if a given [request] comes from a trusted source. + /// + /// If so, the request does not have to provide a pre-established ID, + /// and instead will be assigned one. + bool isTrustedPublishRequest(PublishRequest request); + + bool isTrustedSubscriptionRequest(SubscriptionRequest request); + + /// Fires an event whenever a client tries to publish data. + Stream get onPublish; + + /// Fires whenever a client tries to subscribe to an event. + Stream get onSubscribe; + + /// Fires whenever a client cancels a subscription. + Stream get onUnsubscribe; + + /// Disposes of this adapter. + Future close(); + + /// Start listening for incoming clients. + void start(); +} diff --git a/common/pub_sub/lib/src/protocol/server/client.dart b/common/pub_sub/lib/src/protocol/server/client.dart new file mode 100644 index 0000000..976e3fc --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/client.dart @@ -0,0 +1,14 @@ +/// Represents information about a client that will be accessing +/// this `angel_sync` server. +class ClientInfo { + /// A unique identifier for this client. + final String id; + + /// If `true` (default), then the client is allowed to publish events. + final bool canPublish; + + /// If `true` (default), then the client can subscribe to events. + final bool canSubscribe; + + const ClientInfo(this.id, {this.canPublish = true, this.canSubscribe = true}); +} diff --git a/common/pub_sub/lib/src/protocol/server/publish.dart b/common/pub_sub/lib/src/protocol/server/publish.dart new file mode 100644 index 0000000..8aa0d76 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/publish.dart @@ -0,0 +1,28 @@ +/// Represents a request to publish information to other clients. +abstract class PublishRequest { + /// The ID of the client sending this request. + String? get clientId; + + /// The name of the event to be sent. + String? get eventName; + + /// The value to be published as an event. + Object? get value; + + /// Accept the request, with a response. + void accept(PublishResponse response); + + /// Deny the request with an error message. + void reject(String errorMessage); +} + +/// A response to a publish request. Informs the caller of how much clients received the event. +class PublishResponse { + /// The number of unique listeners to whom this event was propogated. + final int listeners; + + /// The client ID returned the server. 
Significant in cases where an ad-hoc client was registered. + final String? clientId; + + const PublishResponse(this.listeners, this.clientId); +} diff --git a/common/pub_sub/lib/src/protocol/server/server.dart b/common/pub_sub/lib/src/protocol/server/server.dart new file mode 100644 index 0000000..b0cc08d --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/server.dart @@ -0,0 +1,160 @@ +import 'dart:async'; +import 'dart:math'; +import 'adapter.dart'; +import 'client.dart'; +import 'package:collection/collection.dart' show IterableExtension; +import 'publish.dart'; +import 'subscription.dart'; + +/// A server that implements the `pub_sub` protocol. +/// +/// It can work using multiple [Adapter]s, to simultaneously +/// serve local and remote clients alike. +class Server { + final List _adapters = []; + final List _clients = []; + final _rnd = Random.secure(); + final Map> _subscriptions = {}; + bool _started = false; + int _adHocIds = 0; + + /// Initialize a server, optionally with a number of [adapters]. + Server([Iterable adapters = const []]) { + _adapters.addAll(adapters); + } + + /// Adds a new [Adapter] to adapt incoming clients from a new interface. + void addAdapter(Adapter adapter) { + if (_started) { + throw StateError( + 'You cannot add new adapters after the server has started listening.'); + } else { + _adapters.add(adapter); + } + } + + /// Registers a new client with the server. + void registerClient(ClientInfo client) { + if (_started) { + throw StateError( + 'You cannot register new clients after the server has started listening.'); + } else { + _clients.add(client); + } + } + + /// Disposes of this server, and closes all of its adapters. + Future close() { + Future.wait(_adapters.map((a) => a.close())); + _adapters.clear(); + _clients.clear(); + _subscriptions.clear(); + return Future.value(); + } + + String _newClientId() { + // Create an unpredictable-enough ID. The harder it is for an attacker to guess, the better. + var id = + 'pub_sub::adhoc_client${_rnd.nextDouble()}::${_adHocIds++}:${DateTime.now().millisecondsSinceEpoch * _rnd.nextDouble()}'; + + // This client is coming from a trusted source, and can therefore both publish and subscribe. + _clients.add(ClientInfo(id)); + return id; + } + + void start() { + if (_adapters.isEmpty) { + throw StateError( + 'Cannot start a SyncServer that has no adapters attached.'); + } else if (_started) { + throw StateError('A SyncServer may only be started once.'); + } + + _started = true; + + for (var adapter in _adapters) { + adapter.start(); + } + + for (var adapter in _adapters) { + // Handle publishes + adapter.onPublish.listen((rq) { + ClientInfo? client; + String? clientId; + + if (rq.clientId?.isNotEmpty == true || + adapter.isTrustedPublishRequest(rq)) { + clientId = + rq.clientId?.isNotEmpty == true ? rq.clientId : _newClientId(); + client = _clients.firstWhereOrNull((c) => c.id == clientId); + } + + if (client == null) { + rq.reject('Unrecognized client ID "${clientId ?? ''}".'); + } else if (!client.canPublish) { + rq.reject('You are not allowed to publish events.'); + } else { + var listeners = _subscriptions[rq.eventName] + ?.where((s) => s.clientId != clientId) ?? + []; + + if (listeners.isEmpty) { + rq.accept(PublishResponse(0, clientId)); + } else { + for (var listener in listeners) { + listener.dispatch(rq.value); + } + + rq.accept(PublishResponse(listeners.length, clientId)); + } + } + }); + + // Listen for incoming subscriptions + adapter.onSubscribe.listen((rq) async { + ClientInfo? client; + String? 
clientId; + + if (rq.clientId?.isNotEmpty == true || + adapter.isTrustedSubscriptionRequest(rq)) { + clientId = + rq.clientId?.isNotEmpty == true ? rq.clientId : _newClientId(); + client = _clients.firstWhereOrNull((c) => c.id == clientId); + } + + if (client == null) { + rq.reject('Unrecognized client ID "${clientId ?? ''}".'); + } else if (!client.canSubscribe) { + rq.reject('You are not allowed to subscribe to events.'); + } else { + var sub = await rq.accept(clientId); + var list = _subscriptions.putIfAbsent(rq.eventName, () => []); + list.add(sub); + } + }); + + // Unregister subscriptions on unsubscribe + adapter.onUnsubscribe.listen((rq) { + Subscription? toRemove; + late List sourceList; + + for (var list in _subscriptions.values) { + toRemove = list.firstWhereOrNull((s) => s.id == rq.subscriptionId); + if (toRemove != null) { + sourceList = list; + break; + } + } + + if (toRemove == null) { + rq.reject('The specified subscription does not exist.'); + } else if (toRemove.clientId != rq.clientId) { + rq.reject('That is not your subscription to cancel.'); + } else { + sourceList.remove(toRemove); + rq.accept(); + } + }); + } + } +} diff --git a/common/pub_sub/lib/src/protocol/server/subscription.dart b/common/pub_sub/lib/src/protocol/server/subscription.dart new file mode 100644 index 0000000..9f5db23 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/subscription.dart @@ -0,0 +1,47 @@ +import 'dart:async'; + +/// Represents a request to subscribe to an event. +abstract class SubscriptionRequest { + /// The ID of the client requesting to subscribe. + String? get clientId; + + /// The name of the event the client wants to subscribe to. + String? get eventName; + + /// Accept the request, and grant the client access to subscribe to the event. + /// + /// Includes the client's ID, which is necessary for ad-hoc clients. + FutureOr accept(String? clientId); + + /// Deny the request with an error message. + void reject(String errorMessage); +} + +/// Represents a request to unsubscribe to an event. +abstract class UnsubscriptionRequest { + /// The ID of the client requesting to unsubscribe. + String? get clientId; + + /// The name of the event the client wants to unsubscribe from. + String? get subscriptionId; + + /// Accept the request. + FutureOr accept(); + + /// Deny the request with an error message. + void reject(String errorMessage); +} + +/// Represents a client's subscription to an event. +/// +/// Also provides a means to fire an event. +abstract class Subscription { + /// A unique identifier for this subscription. + String? get id; + + /// The ID of the client who requested this subscription. + String? get clientId; + + /// Alerts a client of an event. 
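+  ///
+  /// Implementations forward [event] over their own medium, e.g. a
+  /// `SendPort` for isolates or an `event` notification for JSON-RPC 2.0.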
+ void dispatch(event); +} diff --git a/common/pub_sub/lib/src/protocol/server/sync_server.dart b/common/pub_sub/lib/src/protocol/server/sync_server.dart new file mode 100644 index 0000000..5777b27 --- /dev/null +++ b/common/pub_sub/lib/src/protocol/server/sync_server.dart @@ -0,0 +1,5 @@ +export 'adapter.dart'; +export 'client.dart'; +export 'publish.dart'; +export 'server.dart'; +export 'subscription.dart'; diff --git a/common/pub_sub/pubspec.lock b/common/pub_sub/pubspec.lock new file mode 100644 index 0000000..d0004f1 --- /dev/null +++ b/common/pub_sub/pubspec.lock @@ -0,0 +1,434 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: "direct main" + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + fixnum: + dependency: transitive + description: + name: fixnum + sha256: b6dc7065e46c974bc7c5f143080a6764ec7a4be6da1285ececdc37be96de53be + url: "https://pub.dev" + source: hosted + version: "1.1.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: 
http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + json_rpc_2: + dependency: "direct main" + description: + name: json_rpc_2 + sha256: "246b321532f0e8e2ba474b4d757eaa558ae4fdd0688fdbc1e1ca9705f9b8ca0e" + url: "https://pub.dev" + source: hosted + version: "3.0.3" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: 
"89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + sprintf: + dependency: transitive + description: + name: sprintf + sha256: "1fc9ffe69d4df602376b52949af107d8f5703b77cda567c4d7d86a0693120f23" + url: "https://pub.dev" + source: hosted + version: "7.0.0" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: "direct main" + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + uuid: + dependency: "direct main" + description: + name: uuid + sha256: a5be9ef6618a7ac1e964353ef476418026db906c4facdedaa299b7a2e71690ff + url: "https://pub.dev" + source: hosted + version: "4.5.1" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: 
transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/pub_sub/pubspec.yaml b/common/pub_sub/pubspec.yaml new file mode 100644 index 0000000..2adc75b --- /dev/null +++ b/common/pub_sub/pubspec.yaml @@ -0,0 +1,14 @@ +name: platform_pub_sub +version: 6.3.0 +description: Keep application instances in sync with a simple pub/sub API. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/pub_sub +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + json_rpc_2: ^3.0.0 + stream_channel: ^2.1.0 + uuid: ^4.0.0 + collection: ^1.17.0 +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/pub_sub/test/isolate_test.dart b/common/pub_sub/test/isolate_test.dart new file mode 100644 index 0000000..1c720d8 --- /dev/null +++ b/common/pub_sub/test/isolate_test.dart @@ -0,0 +1,122 @@ +import 'dart:async'; +import 'package:platform_pub_sub/pub_sub.dart'; +import 'package:platform_pub_sub/isolate.dart'; +import 'package:test/test.dart'; + +void main() { + late Server server; + late Client client1, client2, client3; + late IsolateClient trustedClient; + late IsolateAdapter adapter; + + setUp(() async { + adapter = IsolateAdapter(); + client1 = + IsolateClient('isolate_test::secret', adapter.receivePort.sendPort); + client2 = + IsolateClient('isolate_test::secret2', adapter.receivePort.sendPort); + client3 = + IsolateClient('isolate_test::secret3', adapter.receivePort.sendPort); + trustedClient = IsolateClient(null, adapter.receivePort.sendPort); + + server = Server([adapter]) + ..registerClient(const ClientInfo('isolate_test::secret')) + ..registerClient(const ClientInfo('isolate_test::secret2')) + ..registerClient(const ClientInfo('isolate_test::secret3')) + ..registerClient( + const ClientInfo('isolate_test::no_publish', canPublish: false)) + ..registerClient( + const ClientInfo('isolate_test::no_subscribe', canSubscribe: false)) + ..start(); + + var sub = await client3.subscribe('foo'); + sub.listen((data) { + print('Client3 caught foo: $data'); + }); + }); + + tearDown(() { + Future.wait([ + server.close(), + client1.close(), + client2.close(), + client3.close(), + trustedClient.close() + ]); + }); + + group('trusted', () { + test('can publish', () async { + await trustedClient.publish('hey', 'bye'); + expect(trustedClient.clientId, 
isNotNull); + }); + test('can sub/unsub', () async { + String? clientId; + await trustedClient.publish('heyaaa', 'byeaa'); + expect(clientId = trustedClient.clientId, isNotNull); + + var sub = await trustedClient.subscribe('yeppp'); + expect(trustedClient.clientId, clientId); + + await sub.unsubscribe(); + expect(trustedClient.clientId, clientId); + }); + }); + + test('subscribers receive published events', () async { + var sub = await client2.subscribe('foo'); + await client1.publish('foo', 'bar'); + expect(await sub.first, 'bar'); + }); + + test('subscribers are not sent their own events', () async { + var sub = await client1.subscribe('foo'); + await client1.publish('foo', + ''); + await sub.unsubscribe(); + expect(await sub.isEmpty, isTrue); + }); + + test('can unsubscribe', () async { + var sub = await client2.subscribe('foo'); + await client1.publish('foo', 'bar'); + await sub.unsubscribe(); + await client1.publish('foo', ''); + expect(await sub.length, 1); + }); + + group('isolate_server', () { + test('reject unknown client id', () async { + try { + var client = IsolateClient( + 'isolate_test::invalid', adapter.receivePort.sendPort); + await client.publish('foo', 'bar'); + throw 'Invalid client ID\'s should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + + test('reject unprivileged publish', () async { + try { + var client = IsolateClient( + 'isolate_test::no_publish', adapter.receivePort.sendPort); + await client.publish('foo', 'bar'); + throw 'Unprivileged publishes should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + + test('reject unprivileged subscribe', () async { + try { + var client = IsolateClient( + 'isolate_test::no_subscribe', adapter.receivePort.sendPort); + await client.subscribe('foo'); + throw 'Unprivileged subscribes should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + }); +} diff --git a/common/pub_sub/test/json_rpc_2_test.dart b/common/pub_sub/test/json_rpc_2_test.dart new file mode 100644 index 0000000..6f90e94 --- /dev/null +++ b/common/pub_sub/test/json_rpc_2_test.dart @@ -0,0 +1,187 @@ +import 'dart:async'; +import 'dart:convert'; +import 'dart:io'; +import 'package:platform_pub_sub/pub_sub.dart'; +import 'package:platform_pub_sub/json_rpc_2.dart'; +import 'package:stream_channel/stream_channel.dart'; +import 'package:test/test.dart'; + +void main() { + late ServerSocket serverSocket; + late Server server; + late Client client1, client2, client3; + late JsonRpc2Client trustedClient; + JsonRpc2Adapter adapter; + + setUp(() async { + serverSocket = await ServerSocket.bind(InternetAddress.loopbackIPv4, 0); + + adapter = JsonRpc2Adapter( + serverSocket.map>(streamSocket), + isTrusted: true); + + var socket1 = + await Socket.connect(InternetAddress.loopbackIPv4, serverSocket.port); + var socket2 = + await Socket.connect(InternetAddress.loopbackIPv4, serverSocket.port); + var socket3 = + await Socket.connect(InternetAddress.loopbackIPv4, serverSocket.port); + var socket4 = + await Socket.connect(InternetAddress.loopbackIPv4, serverSocket.port); + + client1 = JsonRpc2Client('json_rpc_2_test::secret', streamSocket(socket1)); + client2 = JsonRpc2Client('json_rpc_2_test::secret2', streamSocket(socket2)); + client3 = JsonRpc2Client('json_rpc_2_test::secret3', streamSocket(socket3)); + trustedClient = 
JsonRpc2Client(null, streamSocket(socket4)); + + server = Server([adapter]) + ..registerClient(const ClientInfo('json_rpc_2_test::secret')) + ..registerClient(const ClientInfo('json_rpc_2_test::secret2')) + ..registerClient(const ClientInfo('json_rpc_2_test::secret3')) + ..registerClient( + const ClientInfo('json_rpc_2_test::no_publish', canPublish: false)) + ..registerClient(const ClientInfo('json_rpc_2_test::no_subscribe', + canSubscribe: false)) + ..start(); + + var sub = await client3.subscribe('foo'); + sub.listen((data) { + print('Client3 caught foo: $data'); + }); + }); + + tearDown(() { + Future.wait( + [server.close(), client1.close(), client2.close(), client3.close()]); + }); + + group('trusted', () { + test('can publish', () async { + await trustedClient.publish('hey', 'bye'); + expect(trustedClient.clientId, isNotNull); + }); + test('can sub/unsub', () async { + String? clientId; + await trustedClient.publish('heyaaa', 'byeaa'); + expect(clientId = trustedClient.clientId, isNotNull); + + var sub = await trustedClient.subscribe('yeppp'); + expect(trustedClient.clientId, clientId); + + await sub.unsubscribe(); + expect(trustedClient.clientId, clientId); + }); + }); + + test('subscribers receive published events', () async { + var sub = await client2.subscribe('foo'); + await client1.publish('foo', 'bar'); + expect(await sub.first, 'bar'); + }); + + test('subscribers are not sent their own events', () async { + var sub = await client1.subscribe('foo'); + await client1.publish('foo', + ''); + await sub.unsubscribe(); + expect(await sub.isEmpty, isTrue); + }); + + test('can unsubscribe', () async { + var sub = await client2.subscribe('foo'); + await client1.publish('foo', 'bar'); + await sub.unsubscribe(); + await client1.publish('foo', ''); + expect(await sub.length, 1); + }); + + group('json_rpc_2_server', () { + test('reject unknown client id', () async { + try { + var sock = await Socket.connect( + InternetAddress.loopbackIPv4, serverSocket.port); + var client = + JsonRpc2Client('json_rpc_2_test::invalid', streamSocket(sock)); + await client.publish('foo', 'bar'); + throw 'Invalid client ID\'s should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + + test('reject unprivileged publish', () async { + try { + var sock = await Socket.connect( + InternetAddress.loopbackIPv4, serverSocket.port); + var client = + JsonRpc2Client('json_rpc_2_test::no_publish', streamSocket(sock)); + await client.publish('foo', 'bar'); + throw 'Unprivileged publishes should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + + test('reject unprivileged subscribe', () async { + try { + var sock = await Socket.connect( + InternetAddress.loopbackIPv4, serverSocket.port); + var client = + JsonRpc2Client('json_rpc_2_test::no_subscribe', streamSocket(sock)); + await client.subscribe('foo'); + throw 'Unprivileged subscribes should throw an error, but they do not.'; + } on PubSubException catch (e) { + print('Expected exception was thrown: ${e.message}'); + } + }); + }); +} + +StreamChannel streamSocket(Socket socket) { + var channel = _SocketStreamChannel(socket); + return channel + .cast>() + .transform(StreamChannelTransformer.fromCodec(utf8)); +} + +class _SocketStreamChannel extends StreamChannelMixin> { + _SocketSink? 
_sink; + final Socket socket; + + _SocketStreamChannel(this.socket); + + @override + StreamSink> get sink => _sink ??= _SocketSink(socket); + + @override + Stream> get stream => socket; +} + +class _SocketSink implements StreamSink> { + final Socket socket; + + _SocketSink(this.socket); + + @override + void add(List event) { + socket.add(event); + } + + @override + void addError(Object error, [StackTrace? stackTrace]) { + Zone.current.errorCallback(error, stackTrace); + } + + @override + Future addStream(Stream> stream) { + return socket.addStream(stream); + } + + @override + Future close() { + return socket.close(); + } + + @override + Future get done => socket.done; +} diff --git a/common/range_header/AUTHORS.md b/common/range_header/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/range_header/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/range_header/CHANGELOG.md b/common/range_header/CHANGELOG.md new file mode 100644 index 0000000..4b1fc14 --- /dev/null +++ b/common/range_header/CHANGELOG.md @@ -0,0 +1,65 @@ +# Change Log + +## 6.3.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 6.2.0 + +* Updated `lints` to 3.0.0 + +## 6.1.0 + +* Updated `file` to 7.0.0 + +## 6.0.0 + +* Require Dart >= 3.0 +* Updated all dependencies to latest + +## 6.0.0-beta.1 + +* Require Dart >= 3.0 + +## 5.0.0 + +* Require Dart >= 2.17 + +## 4.0.1 + +* Fixed license link + +## 4.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Published as `platform_range_header` package +* Fixed linter warnings + +## 3.0.2 + +* Updated README + +## 3.0.1 + +* Resolve static analysis warnings + +## 3.0.0 + +* Migrated to work with Dart SDK 2.12.x NNBD + +## 2.0.2 + +* Fix bug in `toContentRange` that printed invalid indices. +* Fold header items by default. + +## 2.0.1 + +* Adjust `RangeHeaderTransformer` to properly print the content range of each item, +when multiple are present. + +## 2.0.0 + +* Dart 2 update. +* Add `RangeHeaderTransformer`. +* Overall restructuring/refactoring. diff --git a/common/range_header/LICENSE b/common/range_header/LICENSE new file mode 100644 index 0000000..e37a346 --- /dev/null +++ b/common/range_header/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. 
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. \ No newline at end of file diff --git a/common/range_header/README.md b/common/range_header/README.md new file mode 100644 index 0000000..7be08bb --- /dev/null +++ b/common/range_header/README.md @@ -0,0 +1,40 @@ +# Belatuk Range Header + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_range_header?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/range_header/LICENSE) + +**Replacement of `package:range_header` with breaking changes to support NNBD.** + +Range header parser for belatuk. Can be used by any dart backend. + +## Installation + +In your `pubspec.yaml`: + +```yaml +dependencies: + platform_range_header: ^6.2.0 +``` + +## Usage + +```dart +handleRequest(HttpRequest request) async { + // Parse the header + var header = RangeHeader.parse(request.headers.value(HttpHeaders.rangeHeader)); + + // Optimize/canonicalize it + var items = RangeHeader.foldItems(header.items); + header = RangeHeader(items); + + // Get info + header.items; + header.rangeUnit; + print(header.items[0].toContentRange(fileSize)); + + // Serve the file + var transformer = RangeHeaderTransformer(header); + await file.openRead().transform(transformer).pipe(request.response); +} +``` diff --git a/common/range_header/analysis_options.yaml b/common/range_header/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/range_header/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/range_header/example/main.dart b/common/range_header/example/main.dart new file mode 100644 index 0000000..dd5f7f8 --- /dev/null +++ b/common/range_header/example/main.dart @@ -0,0 +1,30 @@ +import 'dart:io'; +import 'package:platform_range_header/range_header.dart'; + +var file = File('some_video.mp4'); + +void handleRequest(HttpRequest request) async { + // Parse the header + var header = + RangeHeader.parse(request.headers.value(HttpHeaders.rangeHeader)!); + + // Optimize/canonicalize it + var items = RangeHeader.foldItems(header.items); + header = RangeHeader(items); + + // Get info + header.items; + header.rangeUnit; + for (var item in header.items) { + item.toContentRange(400); + } + + // Serve the file + var transformer = + RangeHeaderTransformer(header, 'video/mp4', await file.length()); + await file + .openRead() + .cast>() + .transform(transformer) + .pipe(request.response); +} diff --git a/common/range_header/example/server.dart b/common/range_header/example/server.dart new file mode 100644 index 
0000000..076acd8 --- /dev/null +++ b/common/range_header/example/server.dart @@ -0,0 +1,103 @@ +//import 'package:angel_framework/angel_framework.dart'; +//import 'package:angel_framework/http.dart'; +//import 'package:angel_static/angel_static.dart'; + +void main() async { + /* + var app = new Angel(); + var http = new AngelHttp(app); + var fs = const LocalFileSystem(); + var vDir = new _RangingVirtualDirectory(app, fs.currentDirectory); + app.logger = new Logger('range_header') + ..onRecord.listen((rec) { + print(rec); + if (rec.error != null) print(rec.error); + if (rec.stackTrace != null) print(rec.stackTrace); + }); + app.mimeTypeResolver + ..addExtension('dart', 'text/dart') + ..addExtension('lock', 'text/plain') + ..addExtension('md', 'text/plain') + ..addExtension('packages', 'text/plain') + ..addExtension('yaml', 'text/plain') + ..addExtension('yml', 'text/plain'); + app.fallback(vDir.handleRequest); + app.fallback((req, res) => throw new AngelHttpException.notFound()); + await http.startServer('127.0.0.1', 3000); + print('Listening at ${http.uri}'); + */ +} +/* +class _RangingVirtualDirectory extends VirtualDirectory { + _RangingVirtualDirectory(Angel app, Directory source) + : super(app, source.fileSystem, + source: source, allowDirectoryListing: true); + + @override + Future serveFile( + File file, FileStat stat, RequestContext req, ResponseContext res) async { + res.headers[HttpHeaders.acceptRangesHeader] = 'bytes'; + + if (req.headers.value(HttpHeaders.rangeHeader)?.startsWith('bytes') == + true) { + var header = + new RangeHeader.parse(req.headers.value(HttpHeaders.rangeHeader)); + header = new RangeHeader(RangeHeader.foldItems(header.items)); + + if (header.items.length == 1) { + var item = header.items[0]; + Stream stream; + int len = 0, total = await file.length(); + + if (item.start == -1) { + if (item.end == -1) { + len = total; + stream = file.openRead(); + } else { + len = item.end + 1; + stream = file.openRead(0, item.end + 1); + } + } else { + if (item.end == -1) { + len = total - item.start; + stream = file.openRead(item.start); + } else { + len = item.end - item.start + 1; + stream = file.openRead(item.start, item.end + 1); + } + } + + res.contentType = new MediaType.parse( + app.mimeTypeResolver.lookup(file.path) ?? + 'application/octet-stream'); + res.statusCode = HttpStatus.partialContent; + res.headers[HttpHeaders.contentLengthHeader] = len.toString(); + res.headers[HttpHeaders.contentRangeHeader] = + 'bytes ' + item.toContentRange(total); + await stream.cast>().pipe(res); + return false; + } else { + var totalFileSize = await file.length(); + var transformer = new RangeHeaderTransformer( + header, + app.mimeTypeResolver.lookup(file.path) ?? 
+ 'application/octet-stream', + await file.length()); + res.statusCode = HttpStatus.partialContent; + res.headers[HttpHeaders.contentLengthHeader] = + transformer.computeContentLength(totalFileSize).toString(); + res.contentType = new MediaType( + 'multipart', 'byteranges', {'boundary': transformer.boundary}); + await file + .openRead() + .cast>() + .transform(transformer) + .pipe(res); + return false; + } + } else { + return await super.serveFile(file, stat, req, res); + } + } +} +*/ diff --git a/common/range_header/lib/range_header.dart b/common/range_header/lib/range_header.dart new file mode 100644 index 0000000..ea17d35 --- /dev/null +++ b/common/range_header/lib/range_header.dart @@ -0,0 +1,4 @@ +export 'src/converter.dart'; +export 'src/exception.dart'; +export 'src/range_header.dart'; +export 'src/range_header_item.dart'; diff --git a/common/range_header/lib/src/converter.dart b/common/range_header/lib/src/converter.dart new file mode 100644 index 0000000..d0ef65a --- /dev/null +++ b/common/range_header/lib/src/converter.dart @@ -0,0 +1,173 @@ +import 'dart:async'; +import 'dart:collection'; +import 'dart:convert'; +//import 'dart:io' hide BytesBuilder; +import 'dart:typed_data'; + +import 'dart:math'; +import 'package:async/async.dart'; +import 'package:charcode/ascii.dart'; +import 'range_header.dart'; + +/// A [StreamTransformer] that uses a parsed [RangeHeader] and transforms an input stream +/// into one compatible with the `multipart/byte-ranges` specification. +class RangeHeaderTransformer + extends StreamTransformerBase, List> { + final RangeHeader header; + final String boundary, mimeType; + final int totalLength; + + RangeHeaderTransformer(this.header, this.mimeType, this.totalLength, + {String? boundary}) + : boundary = boundary ?? _randomString() { + if (header.items.isEmpty) { + throw ArgumentError('`header` cannot be null or empty.'); + } + } + + /// Computes the content length that will be written to a response, given a stream of the given [totalFileSize]. + int computeContentLength(int totalFileSize) { + var len = 0; + + for (var item in header.items) { + if (item.start == -1) { + if (item.end == -1) { + len += totalFileSize; + } else { + //len += item.end + 1; + len += item.end + 1; + } + } else if (item.end == -1) { + len += totalFileSize - item.start; + //len += totalFileSize - item.start - 1; + } else { + len += item.end - item.start; + } + + // Take into consideration the fact that delimiters are written. 
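+        // Each item also costs: the boundary line, a Content-Type header, a
+        // Content-Range header plus the blank line that closes the part
+        // headers, and a trailing CRLF after the payload (counted below).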
+ len += utf8.encode('--$boundary\r\n').length; + len += utf8.encode('Content-Type: $mimeType\r\n').length; + len += utf8 + .encode( + 'Content-Range: ${header.rangeUnit} ${item.toContentRange(totalLength)}/$totalLength\r\n\r\n') + .length; + len += 2; // CRLF + } + + len += utf8.encode('--$boundary--\r\n').length; + + return len; + } + + @override + Stream> bind(Stream> stream) { + var ctrl = StreamController>(); + + Future(() async { + var index = 0; + var enqueued = Queue>(); + var q = StreamQueue(stream); + + Future> absorb(int length) async { + var out = BytesBuilder(); + + while (out.length < length) { + var remaining = length - out.length; + + while (out.length < length && enqueued.isNotEmpty) { + remaining = length - out.length; + var blob = enqueued.removeFirst(); + + if (blob.length > remaining) { + enqueued.addFirst(blob.skip(remaining).toList()); + blob = blob.take(remaining).toList(); + } + + out.add(blob); + index += blob.length; + } + + if (out.length < length && await q.hasNext) { + var blob = await q.next; + remaining = length - out.length; + + if (blob.length > remaining) { + enqueued.addFirst(blob.skip(remaining).toList()); + blob = blob.take(remaining).toList(); + } + + out.add(blob); + index += blob.length; + } + + // If we get this far, and the stream is EMPTY, the user requested + // too many bytes. + if (out.length < length && enqueued.isEmpty && !(await q.hasNext)) { + throw StateError( + 'The range denoted is bigger than the size of the input stream.'); + } + } + + return out.takeBytes(); + } + + for (var item in header.items) { + var chunk = BytesBuilder(); + + // Skip until we reach the start index. + while (index < item.start) { + var remaining = item.start - index; + await absorb(remaining); + } + + // Next, absorb until we reach the end. + if (item.end == -1) { + while (enqueued.isNotEmpty) { + chunk.add(enqueued.removeFirst()); + } + while (await q.hasNext) { + chunk.add(await q.next); + } + } else { + var remaining = item.end - index; + chunk.add(await absorb(remaining)); + } + + // Next, write the boundary and data. + ctrl.add(utf8.encode('--$boundary\r\n')); + ctrl.add(utf8.encode('Content-Type: $mimeType\r\n')); + ctrl.add(utf8.encode( + 'Content-Range: ${header.rangeUnit} ${item.toContentRange(totalLength)}/$totalLength\r\n\r\n')); + ctrl.add(chunk.takeBytes()); + ctrl.add(const [$cr, $lf]); + + // If this range was unbounded, don't bother looping any further. 
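+        // (An open-ended range has already drained the source stream above,
+        // so no later range item could be served from it anyway.)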
+ if (item.end == -1) break; + } + + ctrl.add(utf8.encode('--$boundary--\r\n')); + + await ctrl.close(); + }).catchError((e) { + ctrl.addError(e as Object); + return null; + }); + + return ctrl.stream; + } +} + +var _rnd = Random(); +String _randomString( + {int length = 32, + String validChars = + 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'}) { + var len = _rnd.nextInt((length - 10)) + 10; + var buf = StringBuffer(); + + while (buf.length < len) { + buf.writeCharCode(validChars.codeUnitAt(_rnd.nextInt(validChars.length))); + } + + return buf.toString(); +} diff --git a/common/range_header/lib/src/exception.dart b/common/range_header/lib/src/exception.dart new file mode 100644 index 0000000..f5c6cf1 --- /dev/null +++ b/common/range_header/lib/src/exception.dart @@ -0,0 +1,18 @@ +class RangeHeaderParseException extends FormatException { + //@override + //final String message; + + RangeHeaderParseException(super.message); + + @override + String toString() => 'Range header parse exception: $message'; +} + +class InvalidRangeHeaderException implements Exception { + final String message; + + InvalidRangeHeaderException(this.message); + + @override + String toString() => 'Range header parse exception: $message'; +} diff --git a/common/range_header/lib/src/parser.dart b/common/range_header/lib/src/parser.dart new file mode 100644 index 0000000..2ca5ffa --- /dev/null +++ b/common/range_header/lib/src/parser.dart @@ -0,0 +1,152 @@ +import 'package:charcode/charcode.dart'; +import 'package:source_span/source_span.dart'; +import 'package:string_scanner/string_scanner.dart'; +import 'exception.dart'; +import 'range_header.dart'; +import 'range_header_impl.dart'; +import 'range_header_item.dart'; + +final RegExp _rgxInt = RegExp(r'[0-9]+'); +final RegExp _rgxWs = RegExp(r'[ \n\r\t]'); + +enum TokenType { rangeUnit, comma, int, dash, equals } + +class Token { + final TokenType type; + final SourceSpan? span; + + Token(this.type, this.span); +} + +List scan(String text, List allowedRangeUnits) { + var tokens = []; + var scanner = SpanScanner(text); + + while (!scanner.isDone) { + // Skip whitespace + scanner.scan(_rgxWs); + + if (scanner.scanChar($comma)) { + tokens.add(Token(TokenType.comma, scanner.lastSpan)); + } else if (scanner.scanChar($dash)) { + tokens.add(Token(TokenType.dash, scanner.lastSpan)); + } else if (scanner.scan(_rgxInt)) { + tokens.add(Token(TokenType.int, scanner.lastSpan)); + } else if (scanner.scanChar($equal)) { + tokens.add(Token(TokenType.equals, scanner.lastSpan)); + } else { + var matched = false; + + for (var unit in allowedRangeUnits) { + if (scanner.scan(unit)) { + tokens.add(Token(TokenType.rangeUnit, scanner.lastSpan)); + matched = true; + break; + } + } + + if (!matched) { + var ch = scanner.readChar(); + throw RangeHeaderParseException( + 'Unexpected character: "${String.fromCharCode(ch)}"'); + } + } + } + + return tokens; +} + +class Parser { + Token? _current; + int _index = -1; + final List tokens; + + Parser(this.tokens); + + Token? get current => _current; + + bool get done => _index >= tokens.length - 1; + + RangeHeaderParseException _expected(String type) { + var offset = current?.span?.start.offset; + + if (offset == null) return RangeHeaderParseException('Expected $type.'); + + Token? peek; + + if (_index < tokens.length - 1) peek = tokens[_index + 1]; + + if (peek != null && peek.span != null) { + return RangeHeaderParseException( + 'Expected $type at offset $offset, found "${peek.span!.text}" instead. 
\nSource:\n${peek.span?.highlight() ?? peek.type}'); + } else { + return RangeHeaderParseException( + 'Expected $type at offset $offset, but the header string ended without one.\nSource:\n${current!.span?.highlight() ?? current!.type}'); + } + } + + bool next(TokenType type) { + if (done) return false; + var tok = tokens[_index + 1]; + if (tok.type == type) { + _index++; + _current = tok; + return true; + } else { + return false; + } + } + + RangeHeader? parseRangeHeader() { + if (next(TokenType.rangeUnit)) { + var unit = current!.span!.text; + next(TokenType.equals); // Consume =, if any. + + var items = []; + var item = parseHeaderItem(); + + while (item != null) { + items.add(item); + // Parse comma + if (next(TokenType.comma)) { + item = parseHeaderItem(); + } else { + item = null; + } + } + + if (items.isEmpty) { + throw _expected('range'); + } else { + return RangeHeaderImpl(unit, items); + } + } else { + return null; + } + } + + RangeHeaderItem? parseHeaderItem() { + if (next(TokenType.int)) { + // i.e 500-544, or 600- + var start = int.parse(current!.span!.text); + if (next(TokenType.dash)) { + if (next(TokenType.int)) { + return RangeHeaderItem(start, int.parse(current!.span!.text)); + } else { + return RangeHeaderItem(start); + } + } else { + throw _expected('"-"'); + } + } else if (next(TokenType.dash)) { + // i.e. -599 + if (next(TokenType.int)) { + return RangeHeaderItem(-1, int.parse(current!.span!.text)); + } else { + throw _expected('integer'); + } + } else { + return null; + } + } +} diff --git a/common/range_header/lib/src/range_header.dart b/common/range_header/lib/src/range_header.dart new file mode 100644 index 0000000..ee0fa4d --- /dev/null +++ b/common/range_header/lib/src/range_header.dart @@ -0,0 +1,68 @@ +import 'dart:collection'; +import 'exception.dart'; +import 'parser.dart'; +import 'range_header_item.dart'; +import 'range_header_impl.dart'; + +/// Represents the contents of a parsed `Range` header. +abstract class RangeHeader { + /// Returns an immutable list of the ranges that were parsed. + UnmodifiableListView get items; + + const factory RangeHeader(Iterable items, + {String? rangeUnit}) = _ConstantRangeHeader; + + /// Eliminates any overlapping [items], sorts them, and folds them all into the most efficient representation possible. + static UnmodifiableListView foldItems( + Iterable items) { + var out = {}; + + for (var item in items) { + // Remove any overlapping items, consolidate them. + while (out.any((x) => x.overlaps(item))) { + var f = out.firstWhere((x) => x.overlaps(item)); + out.remove(f); + item = item.consolidate(f); + } + + out.add(item); + } + + return UnmodifiableListView(out.toList()..sort()); + } + + /// Attempts to parse a [RangeHeader] from its [text] representation. + /// + /// You can optionally pass a custom list of [allowedRangeUnits]. + /// The default is `['bytes']`. + /// + /// If [fold] is `true`, the items will be folded into the most compact + /// possible representation. + /// + factory RangeHeader.parse(String text, + {Iterable? allowedRangeUnits, bool fold = true}) { + var tokens = scan(text, allowedRangeUnits?.toList() ?? ['bytes']); + var parser = Parser(tokens); + var header = parser.parseRangeHeader(); + if (header == null) { + throw InvalidRangeHeaderException('Header is null'); + } + var items = foldItems(header.items); + return RangeHeaderImpl(header.rangeUnit, items); + } + + /// Returns this header's range unit. Most commonly, this is `bytes`. + String? 
get rangeUnit; +} + +class _ConstantRangeHeader implements RangeHeader { + final Iterable items_; + @override + final String? rangeUnit; + + const _ConstantRangeHeader(this.items_, {this.rangeUnit = 'bytes'}); + + @override + UnmodifiableListView get items => + UnmodifiableListView(items_); +} diff --git a/common/range_header/lib/src/range_header_impl.dart b/common/range_header/lib/src/range_header_impl.dart new file mode 100644 index 0000000..865d387 --- /dev/null +++ b/common/range_header/lib/src/range_header_impl.dart @@ -0,0 +1,20 @@ +import 'dart:collection'; +import 'range_header.dart'; +import 'range_header_item.dart'; + +/// Represents the contents of a parsed `Range` header. +class RangeHeaderImpl implements RangeHeader { + UnmodifiableListView? _cached; + final List _items = []; + + RangeHeaderImpl(this.rangeUnit, [List items = const []]) { + _items.addAll(items); + } + + @override + UnmodifiableListView get items => + _cached ??= UnmodifiableListView(_items); + + @override + final String? rangeUnit; +} diff --git a/common/range_header/lib/src/range_header_item.dart b/common/range_header/lib/src/range_header_item.dart new file mode 100644 index 0000000..f8e7170 --- /dev/null +++ b/common/range_header/lib/src/range_header_item.dart @@ -0,0 +1,93 @@ +import 'dart:math'; + +import 'package:quiver/core.dart'; + +/// Represents an individual range, with an optional start index and optional end index. +class RangeHeaderItem implements Comparable { + /// The index at which this chunk begins. May be `-1`. + final int start; + + /// The index at which this chunk ends. May be `-1`. + final int end; + + const RangeHeaderItem([this.start = -1, this.end = -1]); + + /// Joins two items together into the largest possible range. + RangeHeaderItem consolidate(RangeHeaderItem other) { + if (!(other.overlaps(this))) { + throw ArgumentError('The two ranges do not overlap.'); + } + return RangeHeaderItem(min(start, other.start), max(end, other.end)); + } + + @override + int get hashCode => hash2(start, end); + + @override + bool operator ==(other) => + other is RangeHeaderItem && other.start == start && other.end == end; + + bool overlaps(RangeHeaderItem other) { + if (other.start <= start) { + return other.end < start; + } else if (other.start > start) { + return other.start <= end; + } + return false; + } + + @override + int compareTo(RangeHeaderItem other) { + if (other.start > start) { + return -1; + } else if (other.start == start) { + if (other.end == end) { + return 0; + } else if (other.end < end) { + return 1; + } else { + return -1; + } + } else if (other.start < start) { + return 1; + } else { + return -1; + } + } + + @override + String toString() { + if (start > -1 && end > -1) { + return '$start-$end'; + } else if (start > -1) { + return '$start-'; + } else { + return '-$end'; + } + } + + /// Creates a representation of this instance suitable for a `Content-Range` header. + /// + /// This can only be used if the user request only one range. If not, send a + /// `multipart/byteranges` response. + /// + /// Please adhere to the standard!!! + /// http://httpwg.org/specs/rfc7233.html + + String toContentRange([int? totalSize]) { + // var maxIndex = totalSize != null ? (totalSize - 1).toString() : '*'; + var s = start > -1 ? 
start : 0; + + if (end == -1) { + if (totalSize == null) { + throw UnsupportedError( + 'If the end of this range is unknown, `totalSize` must not be null.'); + } else { + // if (end == totalSize - 1) { + return '$s-${totalSize - 1}/$totalSize'; + } + } + + return '$s-$end/$totalSize'; + } +} diff --git a/common/range_header/pubspec.lock b/common/range_header/pubspec.lock new file mode 100644 index 0000000..d5af4c6 --- /dev/null +++ b/common/range_header/pubspec.lock @@ -0,0 +1,418 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: "direct main" + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + charcode: + dependency: "direct main" + description: + name: charcode + sha256: fb0f1107cac15a5ea6ef0a6ef71a807b9e4267c713bb93e00e92d737cc8dbd8a + url: "https://pub.dev" + source: hosted + version: "1.4.0" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: "direct dev" + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: 
"https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: "direct dev" + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: "direct dev" + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + quiver: + dependency: "direct main" + description: + name: quiver + sha256: ea0b925899e64ecdfbf9c7becb60d5b50e706ade44a85b2363be2a22d88117d2 + url: "https://pub.dev" + source: hosted + version: "3.2.2" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: 
"3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: "direct main" + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: "direct main" + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" 
+ url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/range_header/pubspec.yaml b/common/range_header/pubspec.yaml new file mode 100644 index 0000000..7d6ec4c --- /dev/null +++ b/common/range_header/pubspec.yaml @@ -0,0 +1,18 @@ +name: platform_range_header +version: 6.3.0 +description: Range header parser for Dart. Beyond parsing, a stream transformer is included. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/range_header +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + async: ^2.11.0 + charcode: ^1.3.0 + quiver: ^3.2.0 + source_span: ^1.10.0 + string_scanner: ^1.2.0 +dev_dependencies: + file: ^7.0.0 + http_parser: ^4.0.0 + logging: ^1.0.1 + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/range_header/test/all_test.dart b/common/range_header/test/all_test.dart new file mode 100644 index 0000000..b56a49d --- /dev/null +++ b/common/range_header/test/all_test.dart @@ -0,0 +1,95 @@ +import 'package:platform_range_header/range_header.dart'; +import 'package:test/test.dart'; + +final Matcher throwsRangeParseException = + throwsA(const TypeMatcher()); + +final Matcher throwsInvalidRangeHeaderException = + throwsA(const TypeMatcher()); + +void main() { + group('one item', () { + test('start and end', () { + var r = RangeHeader.parse('bytes 1-200'); + expect(r.items, hasLength(1)); + expect(r.items.first.start, 1); + expect(r.items.first.end, 200); + }); + + test('start only', () { + var r = RangeHeader.parse('bytes 1-'); + expect(r.items, hasLength(1)); + expect(r.items.first.start, 1); + expect(r.items.first.end, -1); + }); + + test('end only', () { + var r = RangeHeader.parse('bytes -200'); + print(r.items); + expect(r.items, hasLength(1)); + expect(r.items.first.start, -1); + expect(r.items.first.end, 200); + }); + }); + + group('multiple items', () { + test('three items', () { + var r = RangeHeader.parse('bytes 1-20, 21-40, 41-60'); + print(r.items); + expect(r.items, hasLength(3)); + expect(r.items[0].start, 1); + expect(r.items[0].end, 20); + expect(r.items[1].start, 21); + expect(r.items[1].end, 40); + expect(r.items[2].start, 41); + expect(r.items[2].end, 60); + }); + + test('one item without end', () { + var r = RangeHeader.parse('bytes 1-20, 21-'); + print(r.items); + expect(r.items, hasLength(2)); + expect(r.items[0].start, 1); + expect(r.items[0].end, 20); + expect(r.items[1].start, 21); + expect(r.items[1].end, -1); + }); + }); + + group('failures', () { + test('no start with no end', () { + expect(() => RangeHeader.parse('-'), throwsInvalidRangeHeaderException); + }); + }); + + group('exceptions', () { + test('invalid character', () { + expect(() => RangeHeader.parse('!!!'), throwsRangeParseException); + }); + + test('no ranges', () { + expect(() => RangeHeader.parse('bytes'), throwsRangeParseException); + }); + + 
test('no dash after int', () { + expect(() => RangeHeader.parse('bytes 3'), throwsRangeParseException); + expect(() => RangeHeader.parse('bytes 3,'), throwsRangeParseException); + expect(() => RangeHeader.parse('bytes 3 24'), throwsRangeParseException); + }); + + test('no int after dash', () { + expect(() => RangeHeader.parse('bytes -,'), throwsRangeParseException); + }); + }); + + group('complete coverage', () { + test('exception toString()', () { + var m = RangeHeaderParseException('hey'); + expect(m.toString(), contains('hey')); + }); + }); + + test('content-range', () { + expect(RangeHeader.parse('bytes 1-2').items[0].toContentRange(3), '1-2/3'); + }); +} diff --git a/common/symbol_table/AUTHORS.md b/common/symbol_table/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/symbol_table/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/symbol_table/CHANGELOG.md b/common/symbol_table/CHANGELOG.md new file mode 100644 index 0000000..09d2c04 --- /dev/null +++ b/common/symbol_table/CHANGELOG.md @@ -0,0 +1,66 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 +* Fixed lints warnings + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.0.1 + +* Fixed license link + +## 3.0.0 + +* Upgraded from `pendantic` to `lints` linter +* Published as `platform_symbol_table` package +* Fixed linter warnings + +## 2.0.2 + +* Resolved static analysis warnings + +## 2.0.1 + +* Resolved static analysis warnings + +## 2.0.0 + +* Migrated to work with Dart SDK 2.12.x NNBD + +## 1.0.4 + +* Added `context` to `SymbolTable`. + +## 1.0.3 + +* Converted `Visibility` into a `Comparable` class. +* Renamed `add` -> `create`, `put` -> `assign`, and `allVariablesOfVisibility` -> `allVariablesWithVisibility`. +* Added tests for `Visibility` comparing, and `depth`. +* Added `uniqueName()` to `SymbolTable`. +* Fixed a typo in `remove` that would have prevented it from working correctly. + +## 1.0.2 + +* Added `depth` to `SymbolTable`. +* Added `symbolTable` to `Variable`. +* Deprecated the redundant `Constant` class. +* Deprecated `Variable.markAsPrivate()`. +* Added the `Visibility` enumerator. +* Added the field `visibility` to `Variable`. diff --git a/common/symbol_table/LICENSE b/common/symbol_table/LICENSE new file mode 100644 index 0000000..df5e063 --- /dev/null +++ b/common/symbol_table/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. 
Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/common/symbol_table/README.md b/common/symbol_table/README.md new file mode 100644 index 0000000..aabadf3 --- /dev/null +++ b/common/symbol_table/README.md @@ -0,0 +1,171 @@ +# Belatuk Symbol Table + +![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_symbol_table?include_prereleases) +[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety) +[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/symbol_table/LICENSE) + +**Replacement of `package:symbol_table` with breaking changes to support NNBD.** + +A generic symbol table implementation in Dart, with support for scopes and constants. +The symbol tables produced by this package are hierarchical (in this case, tree-shaped), +and utilize basic memoization to speed up repeated lookups. + +## Variables + +To represent a symbol, use `Variable`. I opted for the name `Variable` to avoid conflict with the Dart primitive `Symbol`. + +```dart +var foo = Variable('foo'); +var bar = Variable('bar', value: 'baz'); + +// Call `lock` to mark a symbol as immutable. +var shelley = Variable('foo', value: 'bar')..lock(); + +foo.value = 'bar'; +shelley.value = 'Mary'; // Throws a StateError - constants cannot be overwritten. + +foo.lock(); +foo.value = 'baz'; // Also throws a StateError - Once a variable is locked, it cannot be overwritten. +``` + +## Visibility + +Variables are *public* by default, but can also be marked as *private* or *protected*. This can be helpful if you are trying to determine which symbols should be exported from a library or class. + +```dart +myVariable.visibility = Visibility.protected; +myVariable.visibility = Visibility.private; +``` + +## Symbol Tables + +It's easy to create a basic symbol table: + +```dart +var mySymbolTable = SymbolTable(); +var doubles = SymbolTable(values: { + 'hydrogen': 1.0, + 'avogadro': 6.022e23 +}); + +// Create a new variable within the scope. +doubles.create('one'); +doubles.create('one', value: 1.0); +doubles.create('one', value: 1.0, constant: true); + +// Set a variable within an ancestor, OR create a new variable if none exists. +doubles.assign('two', 2.0); + +// Completely remove a variable. +doubles.remove('two'); + +// Find a symbol, either in this symbol table or an ancestor. +var symbol = doubles.resolve('one'); + +// Find OR create a symbol. 
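+// (`resolveOrCreate` returns the existing symbol if one is already defined;
+// otherwise it creates a new one with the given value.)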
+var symbol = doubles.resolveOrCreate('one'); +var symbol = doubles.resolveOrCreate('one', value: 1.0); +var symbol = doubles.resolveOrCreate('one', value: 1.0, constant: true); +``` + +## Exporting Symbols + +Due to the tree structure of symbol tables, it is extremely easy to +extract a linear list of distinct variables, with variables lower in the hierarchy superseding their parents +(effectively accomplishing variable shadowing). + +```dart +var allSymbols = mySymbolTable.allVariables; +``` + +We can also extract symbols which are *not* private. This helps us export symbols from libraries +or classes. + +```dart +var exportedSymbols = mySymbolTable.allPublicVariables; +``` + +It's easy to extract symbols of a given visibility: + +```dart +var exportedSymbols = mySymbolTable.allVariablesWithVisibility(Visibility.protected); +``` + +## Child Scopes + +There are three ways to create a new symbol table: + +### Regular Children + +This is what most interpreters need; it simply creates a symbol table with the current symbol table +as its parent. The new scope can define its own symbols, which will only shadow the ancestors within the +correct scope. + +```dart +var child = mySymbolTable.createChild(); +var child = mySymbolTable.createChild(values: {...}); +``` + +#### Depth + +Every symbol table has an associated `depth` attached to it, with the `depth` at the root +being `0`. When `createChild` is called, the resulting child has an incremented `depth`. + +### Clones + +This creates a scope at the same level as the current one, with all the same variables. + +```dart +var clone = mySymbolTable.clone(); +``` + +### Forked Scopes + +If you are implementing a language with closure functions, you might consider looking into this. +A forked scope is a scope identical to the current one, but instead of merely copying references +to variables, the values of variables are copied into new ones. + +The new scope is essentially a "frozen" version of the current one. + +It is also effectively orphaned - though it is aware of its `parent`, the parent scope is unaware +that the forked scope is a child. Thus, calls to `resolve` may return old variables, if a parent +has called `remove` on a symbol. + +```dart +var forked = mySymbolTable.fork(); +var forked = mySymbolTable.fork(values: {...}); +``` + +## Creating Names + +In languages with block scope, oftentimes, identifiers will collide within a global scope. +To avoid this, symbol tables expose a `uniqueName()` method that simply attaches a numerical suffix to +an input name. The name is guaranteed to never be repeated within a specific scope. + +```dart +var name0 = mySymbolTable.uniqueName('foo'); // foo0 +var name1 = mySymbolTable.uniqueName('foo'); // foo1 +var name2 = mySymbolTable.uniqueName('foo'); // foo2 +``` + +## `this` Context + +Many languages handle a sort of `this` context that values within a scope may +optionally be resolved against. Symbol tables can easily set their context +as follows: + +```dart +void foo() { + mySymbolTable.context = thisContext; +} +``` + +Resolution of the `context` getter functions just like a symbol; if none is +set locally, then it will refer to the parent. 
+ +```dart +void bar() { + mySymbolTable.context = thisContext; + expect(mySymbolTable.createChild().createChild().context, thisContext); +} +``` diff --git a/common/symbol_table/analysis_options.yaml b/common/symbol_table/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/symbol_table/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/symbol_table/example/main.dart b/common/symbol_table/example/main.dart new file mode 100644 index 0000000..82078fc --- /dev/null +++ b/common/symbol_table/example/main.dart @@ -0,0 +1,26 @@ +import 'package:platform_symbol_table/symbol_table.dart'; + +void main(List args) { + //var mySymbolTable = SymbolTable(); + var doubles = + SymbolTable(values: {'hydrogen': 1.0, 'avogadro': 6.022e23}); + +// Create a new variable within the scope. + doubles.create('one'); + doubles.create('one', value: 1.0); + doubles.create('one', value: 1.0, constant: true); + +// Set a variable within an ancestor, OR create a new variable if none exists. + doubles.assign('two', 2.0); + +// Completely remove a variable. + doubles.remove('two'); + +// Find a symbol, either in this symbol table or an ancestor. + //var symbol1 = doubles.resolve('one'); + +// Find OR create a symbol. + //var symbol2 = doubles.resolveOrCreate('one'); + //var symbol3 = doubles.resolveOrCreate('one', value: 1.0); + //var symbol4 = doubles.resolveOrCreate('one', value: 1.0, constant: true); +} diff --git a/common/symbol_table/lib/src/symbol_table.dart b/common/symbol_table/lib/src/symbol_table.dart new file mode 100644 index 0000000..ae07fd0 --- /dev/null +++ b/common/symbol_table/lib/src/symbol_table.dart @@ -0,0 +1,322 @@ +library symbol_table; + +import 'package:collection/collection.dart' show IterableExtension; +part 'variable.dart'; +part 'visibility.dart'; + +/// A hierarchical mechanism to hold a set of variables, which supports scoping and constant variables. +class SymbolTable { + final List> _children = []; + final Map?> _lookupCache = {}; + final Map _names = {}; + final List> _variables = []; + int _depth = 0; + T? _context; + SymbolTable? _parent, _root; + + /// Initializes an empty symbol table. + /// + /// You can optionally provide a [Map] of starter [values]. + SymbolTable({Map values = const {}}) { + if (values.isNotEmpty == true) { + values.forEach((k, v) { + _variables.add(Variable._(k, this, value: v)); + }); + } + } + + /// Returns the nearest context this symbol table belongs to. Returns `null` if none was set within the entire tree. + /// + /// This can be used to bind values to a `this` scope within a compiler. + T? get context { + SymbolTable? search = this; + + while (search != null) { + if (search._context != null) return search._context; + search = search._parent; + } + + return null; + } + + /// Sets a local context for values within this scope to be resolved against. + set context(T? value) { + _context = value; + } + + /// The depth of this symbol table within the tree. At the root, this is `0`. + int get depth => _depth; + + /// Returns `true` if this scope has no parent. + bool get isRoot => _parent == null; + + /// Gets the parent of this symbol table. + SymbolTable? get parent => _parent; + + /// Resolves the symbol table at the very root of the hierarchy. + /// + /// This value is memoized to speed up future lookups. + SymbolTable? 
get root { + if (_root != null) return _root; + + var out = this; + + while (out._parent != null) { + out = out._parent!; + } + + return _root = out; + } + + /// Retrieves every variable within this scope and its ancestors. + /// + /// Variable names will not be repeated; this produces the effect of + /// shadowed variables. + /// + /// This list is unmodifiable. + List> get allVariables { + var distinct = []; + var out = >[]; + + void crawl(SymbolTable table) { + for (var v in table._variables) { + if (!distinct.contains(v.name)) { + distinct.add(v.name); + out.add(v); + } + } + + if (table._parent != null) crawl(table._parent!); + } + + crawl(this); + return List>.unmodifiable(out); + } + + /// Helper for calling [allVariablesWithVisibility] to fetch all public variables. + List> get allPublicVariables { + return allVariablesWithVisibility(Visibility.public); + } + + /// Use [allVariablesWithVisibility] instead. + @Deprecated("allVariablesWithVisibility") + List> allVariablesOfVisibility(Visibility visibility) { + return allVariablesWithVisibility(visibility); + } + + /// Retrieves every variable of the given [visibility] within this scope and its ancestors. + /// + /// Variable names will not be repeated; this produces the effect of + /// shadowed variables. + /// + /// Use this to "export" symbols out of a library or class. + /// + /// This list is unmodifiable. + List> allVariablesWithVisibility(Visibility visibility) { + var distinct = []; + var out = >[]; + + void crawl(SymbolTable table) { + for (var v in table._variables) { + if (!distinct.contains(v.name) && v.visibility == visibility) { + distinct.add(v.name); + out.add(v); + } + } + + if (table._parent != null) crawl(table._parent!); + } + + crawl(this); + return List>.unmodifiable(out); + } + + Variable? operator [](String name) => resolve(name); + + void operator []=(String name, T value) { + assign(name, value); + } + + void _wipeLookupCache(String key) { + _lookupCache.remove(key); + for (var c in _children) { + c._wipeLookupCache(key); + } + } + + /// Use [create] instead. + @Deprecated("create") + Variable add(String name, {T? value, bool? constant}) { + return create(name, value: value, constant: constant); + } + + /// Create a new variable *within this scope*. + /// + /// You may optionally provide a [value], or mark the variable as [constant]. + Variable create(String name, {T? value, bool? constant}) { + // Check if it exists first. + if (_variables.any((v) => v.name == name)) { + throw StateError( + 'A symbol named "$name" already exists within the current context.'); + } + + _wipeLookupCache(name); + var v = Variable._(name, this, value: value); + if (constant == true) v.lock(); + _variables.add(v); + return v; + } + + /// Use [assign] instead. + @Deprecated("assign") + Variable put(String name, T value) { + return assign(name, value); + } + + /// Assigns a [value] to the variable with the given [name], or creates a new variable. + /// + /// You cannot use this method to assign constants. + /// + /// Returns the variable whose value was just assigned. + Variable assign(String name, T value) { + return resolveOrCreate(name)..value = value; + } + + /// Removes the variable with the given [name] from this scope, or an ancestor. + /// + /// Returns the deleted variable, or `null`. + /// + /// *Note: This may cause [resolve] calls in [fork]ed scopes to return `null`.* + /// *Note: There is a difference between symbol tables created via [fork], [createdChild], and [clone].* + Variable? remove(String name) { + SymbolTable? 
search = this; + + while (search != null) { + var variable = search._variables.firstWhereOrNull((v) => v.name == name); + + if (variable != null) { + search._wipeLookupCache(name); + search._variables.remove(variable); + return variable; + } + + search = search._parent; + } + + return null; + } + + /// Finds the variable with the given name, either within this scope or an ancestor. + /// + /// Returns `null` if none has been found. + Variable? resolve(String name) { + var v = _lookupCache.putIfAbsent(name, () { + var variable = _variables.firstWhereOrNull((v) => v.name == name); + + if (variable != null) { + return variable; + } else if (_parent != null) { + return _parent?.resolve(name); + } else { + return null; + } + }); + + if (v == null) { + _lookupCache.remove(name); + return null; + } else { + return v; + } + } + + /// Finds the variable with the given name, either within this scope or an ancestor. + /// Creates a new variable if none was found. + /// + /// If a new variable is created, you may optionally give it a [value]. + /// You can also mark the new variable as a [constant]. + Variable resolveOrCreate(String name, {T? value, bool? constant}) { + var resolved = resolve(name); + if (resolved != null) return resolved; + return create(name, value: value, constant: constant); + } + + /// Creates a child scope within this one. + /// + /// You may optionally provide starter [values]. + SymbolTable createChild({Map values = const {}}) { + var child = SymbolTable(values: values); + child + .._depth = _depth + 1 + .._parent = this + .._root = _root; + _children.add(child); + return child; + } + + /// Creates a scope identical to this one, but with no children. + /// + /// The [parent] scope will see the new scope as a child. + SymbolTable clone() { + var table = SymbolTable(); + table._variables.addAll(_variables); + table + .._depth = _depth + .._parent = _parent + .._root = _root; + _parent?._children.add(table); + return table; + } + + /// Creates a *forked* scope, derived from this one. + /// You may provide starter [values]. + /// + /// As opposed to [createChild], all variables in the resulting forked + /// scope will be *copies* of those in this class. This makes forked + /// scopes useful for implementations of concepts like closure functions, + /// where the current values of variables are trapped. + /// + /// The forked scope is essentially orphaned and stands alone; although its + /// [parent] getter will point to the parent of the original scope, the parent + /// will not be aware of the new scope's existence. + SymbolTable fork({Map values = const {}}) { + var table = SymbolTable(); + + table + .._depth = _depth + .._parent = _parent + .._root = _root; + + table._variables.addAll(_variables.map((Variable v) { + var variable = Variable._(v.name, this, value: v.value as T?); + variable.visibility = v.visibility; + + if (v.isImmutable) variable.lock(); + return variable; + })); + + return table; + } + + /// Returns a variation on the input [name] that is guaranteed to never be repeated within this scope. + /// + /// The variation will the input [name], but with a numerical suffix appended. + /// Ex. `foo1`, `bar24` + String uniqueName(String name) { + var count = 0; + SymbolTable? 
search = this; + + while (search != null) { + if (search._names.containsKey(name)) count += search._names[name]!; + search = search._parent; + } + + _names.putIfAbsent(name, () => 0); + var n = _names[name]; + if (n != null) { + n++; + _names[name] = n; + } + return '$name$count'; + } +} diff --git a/common/symbol_table/lib/src/variable.dart b/common/symbol_table/lib/src/variable.dart new file mode 100644 index 0000000..eea01cb --- /dev/null +++ b/common/symbol_table/lib/src/variable.dart @@ -0,0 +1,31 @@ +part of 'symbol_table.dart'; + +/// Holds a symbol, the value of which may change or be marked immutable. +class Variable { + final String name; + final SymbolTable? symbolTable; + Visibility visibility = Visibility.public; + bool _locked = false; + T? _value; + + Variable._(this.name, this.symbolTable, {T? value}) { + _value = value; + } + + /// If `true`, then the value of this variable cannot be overwritten. + bool get isImmutable => _locked; + + T? get value => _value; + + set value(T? value) { + if (_locked) { + throw StateError('The value of constant "$name" cannot be overwritten.'); + } + _value = value; + } + + /// Locks this symbol, and prevents its [value] from being overwritten. + void lock() { + _locked = true; + } +} diff --git a/common/symbol_table/lib/src/visibility.dart b/common/symbol_table/lib/src/visibility.dart new file mode 100644 index 0000000..be53da3 --- /dev/null +++ b/common/symbol_table/lib/src/visibility.dart @@ -0,0 +1,31 @@ +part of 'symbol_table.dart'; + +/// Represents the visibility of a symbol. +/// +/// Symbols may be [public], [protected], or [private]. +/// The significance of a symbol's visibility is semantic and specific to the interpreter/compiler; +/// this package attaches no specific meaning to it. +/// +/// [Visibility] instances can be compared using the `<`, `<=`, `>`, and `>=` operators. +/// The evaluation of the aforementioned operators is logical; +/// for example, a [private] symbol is *less visible* than a [public] symbol, +/// so [private] < [public]. +/// +/// In a nutshell: [private] < [protected] < [public]. 
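+///
+/// For example, the following comparisons hold:
+///
+/// ```dart
+/// assert(Visibility.private < Visibility.protected);
+/// assert(Visibility.protected < Visibility.public);
+/// ```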
+class Visibility implements Comparable { + static const Visibility private = Visibility._(0); + static const Visibility protected = Visibility._(1); + static const Visibility public = Visibility._(2); + final int _n; + const Visibility._(this._n); + + bool operator >(Visibility other) => _n > other._n; + bool operator >=(Visibility other) => _n >= other._n; + bool operator <(Visibility other) => _n < other._n; + bool operator <=(Visibility other) => _n <= other._n; + + @override + int compareTo(Visibility other) { + return _n.compareTo(other._n); + } +} diff --git a/common/symbol_table/lib/symbol_table.dart b/common/symbol_table/lib/symbol_table.dart new file mode 100644 index 0000000..d945b7a --- /dev/null +++ b/common/symbol_table/lib/symbol_table.dart @@ -0,0 +1 @@ +export 'src/symbol_table.dart'; diff --git a/common/symbol_table/pubspec.lock b/common/symbol_table/pubspec.lock new file mode 100644 index 0000000..00ac0dd --- /dev/null +++ b/common/symbol_table/pubspec.lock @@ -0,0 +1,402 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: "direct main" + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: 
glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: 
"89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: 
transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/symbol_table/pubspec.yaml b/common/symbol_table/pubspec.yaml new file mode 100644 index 0000000..cf7c80f --- /dev/null +++ b/common/symbol_table/pubspec.yaml @@ -0,0 +1,11 @@ +name: platform_symbol_table +version: 5.2.0 +description: A generic symbol table implementation in Dart, with support for scopes and constants. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/symbol_table +environment: + sdk: '>=3.3.0 <4.0.0' +dependencies: + collection: ^1.17.0 +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/symbol_table/test/all_test.dart b/common/symbol_table/test/all_test.dart new file mode 100644 index 0000000..48adb2a --- /dev/null +++ b/common/symbol_table/test/all_test.dart @@ -0,0 +1,144 @@ +import 'package:platform_symbol_table/symbol_table.dart'; +import 'package:test/test.dart'; + +void main() { + late SymbolTable scope; + + setUp(() { + scope = SymbolTable(values: {'one': 1}); + }); + + test('starter values', () { + expect(scope['one']?.value, 1); + }); + + test('add', () { + var two = scope.create('two', value: 2); + expect(two.value, 2); + expect(two.isImmutable, isFalse); + }); + + test('put', () { + var one = scope.resolve('one'); + var child = scope.createChild(); + var three = child.assign('one', 3); + expect(three.value, 3); + expect(three, one); + }); + + test('private', () { + var three = scope.create('three', value: 3) + ..visibility = Visibility.private; + expect(scope.allVariables, contains(three)); + expect( + scope.allVariablesWithVisibility(Visibility.private), contains(three)); + expect(scope.allPublicVariables, isNot(contains(three))); + }); + + test('protected', () { + var three = scope.create('three', value: 3) + ..visibility = Visibility.protected; + expect(scope.allVariables, contains(three)); + expect(scope.allVariablesWithVisibility(Visibility.protected), + contains(three)); + expect(scope.allPublicVariables, isNot(contains(three))); + }); + + test('constants', () { + var two = scope.create('two', value: 2, constant: true); + expect(two.value, 2); + expect(two.isImmutable, isTrue); + expect(() => scope['two'] = 3, throwsStateError); + }); + + test('lock', () { + expect(scope['one']?.isImmutable, isFalse); + scope['one']!.lock(); + expect(scope['one']?.isImmutable, isTrue); + expect(() => scope['one'] = 2, throwsStateError); + }); + + test('child', () { + expect(scope.createChild().createChild().resolve('one')!.value, 1); + }); + + test('clone', () { + var child = scope.createChild(); + var clone = child.clone(); + expect(clone.resolve('one'), child.resolve('one')); + expect(clone.parent, child.parent); + }); + + 
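+  // fork() copies the current variable values into a standalone scope, so a
+  // symbol created in the original scope afterwards is not visible in the fork.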
test('fork', () { + var fork = scope.fork(); + scope.assign('three', 3); + + expect(scope.resolve('three'), isNotNull); + expect(fork.resolve('three'), isNull); + }); + + test('remove', () { + var one = scope.remove('one')!; + expect(one.value, 1); + + expect(scope.resolve('one'), isNull); + }); + + test('root', () { + expect(scope.isRoot, isTrue); + expect(scope.root, scope); + + var child = scope + .createChild() + .createChild() + .createChild() + .createChild() + .createChild() + .createChild() + .createChild(); + expect(child.isRoot, false); + expect(child.root, scope); + }); + + test('visibility comparisons', () { + expect([Visibility.private, Visibility.protected], + everyElement(lessThan(Visibility.public))); + expect(Visibility.private, lessThan(Visibility.protected)); + expect(Visibility.protected, greaterThan(Visibility.private)); + expect(Visibility.public, greaterThan(Visibility.private)); + expect(Visibility.public, greaterThan(Visibility.protected)); + }); + + test('depth', () { + expect(scope.depth, 0); + expect(scope.clone().depth, 0); + expect(scope.fork().depth, 0); + expect(scope.createChild().depth, 1); + expect(scope.createChild().createChild().depth, 2); + expect(scope.createChild().createChild().createChild().depth, 3); + }); + + test('unique name', () { + expect(scope.uniqueName('foo'), 'foo0'); + expect(scope.uniqueName('foo'), 'foo1'); + expect(scope.createChild().uniqueName('foo'), 'foo2'); + expect(scope.createChild().uniqueName('foo'), 'foo2'); + var child = scope.createChild(); + expect(child.uniqueName('foo'), 'foo2'); + expect(child.uniqueName('foo'), 'foo3'); + expect(child.createChild().uniqueName('foo'), 'foo4'); + }); + + test('context', () { + scope.context = 24; + expect(scope.context, 24); + expect(scope.createChild().context, 24); + expect(scope.createChild().createChild().context, 24); + + var child = scope.createChild().createChild()..context = 35; + expect(child.context, 35); + expect(child.createChild().context, 35); + expect(child.createChild().createChild().context, 35); + expect(scope.context, 24); + }); +} diff --git a/common/user_agent/AUTHORS.md b/common/user_agent/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/common/user_agent/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. diff --git a/common/user_agent/CHANGELOG.md b/common/user_agent/CHANGELOG.md new file mode 100644 index 0000000..244db2d --- /dev/null +++ b/common/user_agent/CHANGELOG.md @@ -0,0 +1,48 @@ +# Change Log + +## 5.2.0 + +* Require Dart >= 3.3 +* Updated `lints` to 4.0.0 + +## 5.1.0 + +* Updated `lints` to 3.0.0 + +## 5.0.0 + +* Require Dart >= 3.0 + +## 5.0.0-beta.1 + +* Require Dart >= 3.0 + +## 4.0.0 + +* Require Dart >= 2.17 + +## 3.1.0 + +* Fixed license link +* Upgraded from `pendantic` to `lints` linter +* Fixed linter warnings + +## 3.0.2 + +* Updated repository links + +## 3.0.1 + +* Updated to use non nullable results + +## 3.0.0 + +* Migrated to support Dart SDK 2.12.x NNBD + +## 2.0.0 + +* Dart 2 updates. 
+
+## 0.0.1
+
+* Initial version, created by Stagehand
diff --git a/common/user_agent/LICENSE b/common/user_agent/LICENSE
new file mode 100644
index 0000000..e37a346
--- /dev/null
+++ b/common/user_agent/LICENSE
@@ -0,0 +1,29 @@
+BSD 3-Clause License
+
+Copyright (c) 2021, dukefirehawk.com
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+1. Redistributions of source code must retain the above copyright notice, this
+   list of conditions and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright notice,
+   this list of conditions and the following disclaimer in the documentation
+   and/or other materials provided with the distribution.
+
+3. Neither the name of the copyright holder nor the names of its
+   contributors may be used to endorse or promote products derived from
+   this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
\ No newline at end of file
diff --git a/common/user_agent/README.md b/common/user_agent/README.md
new file mode 100644
index 0000000..3d9b4c1
--- /dev/null
+++ b/common/user_agent/README.md
@@ -0,0 +1,26 @@
+# User Agent Analyzer
+
+![Pub Version (including pre-releases)](https://img.shields.io/pub/v/platform_agent_analyzer?include_prereleases)
+[![Null Safety](https://img.shields.io/badge/null-safety-brightgreen)](https://dart.dev/null-safety)
+[![License](https://img.shields.io/github/license/dart-backend/belatuk-common-utilities)](https://github.com/dart-backend/belatuk-common-utilities/blob/main/packages/user_agent/LICENSE)
+
+**Replacement of `package:user_agent` with breaking changes to support NNBD.**
+
+A library to identify the type of device and web browser based on the `User-Agent` string.
+
+Runs anywhere.
+
+```dart
+void main() async {
+  app.get('/', (req, res) async {
+    var ua = UserAgent(req.headers.value('user-agent'));
+
+    if (!ua.isChrome) {
+      res.redirect('/upgrade-your-browser');
+      return;
+    } else {
+      // ...
+ } + }); +} +``` diff --git a/common/user_agent/analysis_options.yaml b/common/user_agent/analysis_options.yaml new file mode 100644 index 0000000..ea2c9e9 --- /dev/null +++ b/common/user_agent/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml \ No newline at end of file diff --git a/common/user_agent/example/example.dart b/common/user_agent/example/example.dart new file mode 100644 index 0000000..243d9d6 --- /dev/null +++ b/common/user_agent/example/example.dart @@ -0,0 +1,7 @@ +import 'package:platform_agent_analyzer/user_agent_analyzer.dart'; + +void main() { + var ua = UserAgent( + 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36'); + print(ua.isChrome); +} diff --git a/common/user_agent/lib/user_agent_analyzer.dart b/common/user_agent/lib/user_agent_analyzer.dart new file mode 100644 index 0000000..40c7325 --- /dev/null +++ b/common/user_agent/lib/user_agent_analyzer.dart @@ -0,0 +1,235 @@ +library platform_agent_analyzer; + +/// Utils for device detection. +class UserAgent { + bool _isChrome = false; + bool _isOpera = false; + bool _isIE = false; + bool _isFirefox = false; + bool _isWebKit = false; + String? _cachedCssPrefix; + String? _cachedPropertyPrefix; + + final String value, _lowerValue; + + static const List knownMobileUserAgentPrefixes = [ + 'w3c ', + 'w3c-', + 'acs-', + 'alav', + 'alca', + 'amoi', + 'audi', + 'avan', + 'benq', + 'bird', + 'blac', + 'blaz', + 'brew', + 'cell', + 'cldc', + 'cmd-', + 'dang', + 'doco', + 'eric', + 'hipt', + 'htc_', + 'inno', + 'ipaq', + 'ipod', + 'jigs', + 'kddi', + 'keji', + 'leno', + 'lg-c', + 'lg-d', + 'lg-g', + 'lge-', + 'lg/u', + 'maui', + 'maxo', + 'midp', + 'mits', + 'mmef', + 'mobi', + 'mot-', + 'moto', + 'mwbp', + 'nec-', + 'newt', + 'noki', + 'palm', + 'pana', + 'pant', + 'phil', + 'play', + 'port', + 'prox', + 'qwap', + 'sage', + 'sams', + 'sany', + 'sch-', + 'sec-', + 'send', + 'seri', + 'sgh-', + 'shar', + 'sie-', + 'siem', + 'smal', + 'smar', + 'sony', + 'sph-', + 'symb', + 't-mo', + 'teli', + 'tim-', + 'tosh', + 'tsm-', + 'upg1', + 'upsi', + 'vk-v', + 'voda', + 'wap-', + 'wapa', + 'wapi', + 'wapp', + 'wapr', + 'webc', + 'winw', + 'winw', + 'xda ', + 'xda-' + ]; + + static const List knownMobileUserAgentKeywords = [ + 'blackberry', + 'webos', + 'ipod', + 'lge vx', + 'midp', + 'maemo', + 'mmp', + 'mobile', + 'netfront', + 'hiptop', + 'nintendo DS', + 'novarra', + 'openweb', + 'opera mobi', + 'opera mini', + 'palm', + 'psp', + 'phone', + 'smartphone', + 'symbian', + 'up.browser', + 'up.link', + 'wap', + 'windows ce' + ]; + + static const List knownTabletUserAgentKeywords = [ + 'ipad', + 'playbook', + 'hp-tablet', + 'kindle' + ]; + + UserAgent(this.value) : _lowerValue = value.toLowerCase(); + + /// Determines if the user agent string contains the desired string. Case-insensitive. 
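+  ///
+  /// For example, `contains('chrome')` and `contains('Chrome')` both return
+  /// `true` for a typical Chrome user agent string.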
+ bool contains(String needle) => _lowerValue.contains(needle.toLowerCase()); + + bool get isDesktop => isMacOS || (!isMobile && !isTablet); + + bool get isTablet => knownTabletUserAgentKeywords.any(contains); + + bool get isMobile => knownMobileUserAgentKeywords.any(contains); + + bool get isMacOS => contains('Macintosh') || contains('Mac OS X'); + + bool get isSafari => contains('Safari'); + + bool get isAndroid => contains('android'); + + bool get isAndroidPhone => contains('android') && contains('mobile'); + + bool get isAndroidTablet => contains('android') && !contains('mobile'); + + bool get isWindows => contains('windows'); + + bool get isWindowsPhone => isWindows && contains('phone'); + + bool get isWindowsTablet => isWindows && contains('touch'); + + bool get isBlackberry => + contains('blackberry') || contains('bb10') || contains('rim'); + + bool get isBlackberryPhone => isBlackberry && !contains('tablet'); + + bool get isBlackberryTablet => isBlackberry && contains('tablet'); + + /// Determines if the current device is running Chrome. + bool get isChrome { + _isChrome = value.contains('Chrome', 0); + return _isChrome; + } + + /// Determines if the current device is running Opera. + bool get isOpera { + _isOpera = value.contains('Opera', 0); + return _isOpera; + } + + /// Determines if the current device is running Internet Explorer. + bool get isIE { + _isIE = !isOpera && value.contains('Trident/', 0); + return _isIE; + } + + /// Determines if the current device is running Firefox. + bool get isFirefox { + _isFirefox = value.contains('Firefox', 0); + return _isFirefox; + } + + /// Determines if the current device is running WebKit. + bool get isWebKit { + _isWebKit = !isOpera && value.contains('WebKit', 0); + return _isWebKit; + } + + /// Gets the CSS property prefix for the current platform. + String get cssPrefix { + var prefix = _cachedCssPrefix; + if (prefix != null) return prefix; + if (isFirefox) { + prefix = '-moz-'; + } else if (isIE) { + prefix = '-ms-'; + } else if (isOpera) { + prefix = '-o-'; + } else { + prefix = '-webkit-'; + } + return _cachedCssPrefix = prefix; + } + + /// Prefix as used for JS property names. 
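+  ///
+  /// For example, this is `moz` on Firefox, `ms` on Internet Explorer, and
+  /// `webkit` for WebKit-based browsers.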
+ String get propertyPrefix { + var prefix = _cachedPropertyPrefix; + if (prefix != null) return prefix; + if (isFirefox) { + prefix = 'moz'; + } else if (isIE) { + prefix = 'ms'; + } else if (isOpera) { + prefix = 'o'; + } else { + prefix = 'webkit'; + } + return _cachedPropertyPrefix = prefix; + } +} diff --git a/common/user_agent/pubspec.lock b/common/user_agent/pubspec.lock new file mode 100644 index 0000000..19e27ee --- /dev/null +++ b/common/user_agent/pubspec.lock @@ -0,0 +1,402 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: transitive + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: e3493833ea012784c740e341952298f1cc77f1f01b1bbc3eb4eecf6984fb7f43 + url: "https://pub.dev" + source: hosted + version: "1.11.1" + crypto: + dependency: transitive + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + 
url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: dfd5a80599cf0165756e3181807ed3e77daf6dd4137caaad72d0b7931597650b + url: "https://pub.dev" + source: hosted + version: "1.0.5" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "976c774dd944a42e83e2467f4cc670daef7eed6295b10b36ae8c85bcbf828235" + url: "https://pub.dev" + source: hosted + version: "4.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: package_config + sha256: "92d4488434b520a62570293fbd33bb556c7d49230791c1b4bbd973baf6d2dc67" + url: "https://pub.dev" + source: hosted + version: "2.1.1" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "7b3cfbf654f3edd0c6298ecd5be782ce997ddf0e00531b9464b55245185bbbbd" + url: "https://pub.dev" + source: hosted + version: "2.1.5" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: cc36c297b52866d203dbf9332263c94becc2fe0ceaa9681d07b6ef9807023b67 + url: "https://pub.dev" + source: 
hosted + version: "2.0.1" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "190222579a448b03896e0ca6eca5998fa810fda630c1d65e2f78b3f638f54812" + url: "https://pub.dev" + source: hosted + version: "0.10.13" + source_span: + dependency: transitive + description: + name: source_span + sha256: "254ee5351d6cb365c859e20ee823c3bb479bf4a293c22d17a9f1bf144ce86f7c" + url: "https://pub.dev" + source: hosted + version: "1.10.1" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "22eb7769bee38c7e032d532e8daa2e1cc901b799f603550a4db8f3a5f5173ea2" + url: "https://pub.dev" + source: hosted + version: "1.25.12" + test_api: + dependency: transitive + description: + name: test_api + sha256: fb31f383e2ee25fbbfe06b40fe21e1e458d14080e3c67e7ba0acfde4df4e0bbd + url: "https://pub.dev" + source: hosted + version: "0.7.4" + test_core: + dependency: transitive + description: + name: test_core + sha256: "84d17c3486c8dfdbe5e12a50c8ae176d15e2a771b96909a9442b40173649ccaa" + url: "https://pub.dev" + source: hosted + version: "0.6.8" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: ddfa8d30d89985b96407efce8acbdd124701f96741f2d981ca860662f1c0dc02 + url: "https://pub.dev" + source: hosted + version: "15.0.0" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: 
"87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/common/user_agent/pubspec.yaml b/common/user_agent/pubspec.yaml new file mode 100644 index 0000000..20450df --- /dev/null +++ b/common/user_agent/pubspec.yaml @@ -0,0 +1,9 @@ +name: platform_agent_analyzer +version: 5.2.0 +description: A library to identify the type of devices and web browsers based on User-Agent string. +homepage: https://github.com/dart-backend/belatuk-common-utilities/tree/main/packages/user_agent +environment: + sdk: '>=3.3.0 <4.0.0' +dev_dependencies: + test: ^1.24.0 + lints: ^4.0.0 diff --git a/common/user_agent/test/user_agent_test.dart b/common/user_agent/test/user_agent_test.dart new file mode 100644 index 0000000..3b92436 --- /dev/null +++ b/common/user_agent/test/user_agent_test.dart @@ -0,0 +1,30 @@ +// Copyright (c) 2017, thosakwe. All rights reserved. Use of this source code +// is governed by a BSD-style license that can be found in the LICENSE file. + +import 'package:platform_agent_analyzer/user_agent_analyzer.dart'; +import 'package:test/test.dart'; + +void main() { + test('chrome', () { + var ua = UserAgent( + 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.95 Safari/537.36'); + expect([ua.isChrome, ua.isWebKit, ua.isSafari, ua.isDesktop, ua.isMacOS], + everyElement(isTrue)); + expect([ua.isFirefox, ua.isIE, ua.isOpera, ua.isMobile, ua.isTablet], + everyElement(isFalse)); + expect([ + ua.isAndroid, + ua.isAndroidPhone, + ua.isAndroidTablet, + ua.isBlackberry, + ua.isBlackberryPhone, + ua.isBlackberryTablet, + ua.isWindows, + ua.isWindowsPhone, + ua.isWindowsTablet + ], everyElement(isFalse)); + + expect(ua.cssPrefix, equals('-webkit-')); + expect(ua.propertyPrefix, equals('webkit')); + }); +} diff --git a/drivers/rethinkdb/.gitignore b/drivers/rethinkdb/.gitignore new file mode 100644 index 0000000..a247422 --- /dev/null +++ b/drivers/rethinkdb/.gitignore @@ -0,0 +1,75 @@ +# Miscellaneous +*.class +*.log +*.pyc +*.swp +.DS_Store +.atom/ +.buildlog/ +.history +.svn/ + +# IntelliJ related +*.iml +*.ipr +*.iws +.idea/ + +# The .vscode folder contains launch configuration and tasks you configure in +# VS Code which you may wish to be included in version control, so this line +# is commented out by default. +#.vscode/ + +# Flutter/Dart/Pub related +**/doc/api/ +.dart_tool/ +.flutter-plugins +.flutter-plugins-dependencies +.packages +.pub-cache/ +.pub/ +build/ + +# Android related +**/android/**/gradle-wrapper.jar +**/android/.gradle +**/android/captures/ +**/android/gradlew +**/android/gradlew.bat +**/android/local.properties +**/android/**/GeneratedPluginRegistrant.java + +# iOS/XCode related +**/ios/**/*.mode1v3 +**/ios/**/*.mode2v3 +**/ios/**/*.moved-aside +**/ios/**/*.pbxuser +**/ios/**/*.perspectivev3 +**/ios/**/*sync/ +**/ios/**/.sconsign.dblite +**/ios/**/.tags* +**/ios/**/.vagrant/ +**/ios/**/DerivedData/ +**/ios/**/Icon? 
+**/ios/**/Pods/ +**/ios/**/.symlinks/ +**/ios/**/profile +**/ios/**/xcuserdata +**/ios/.generated/ +**/ios/Flutter/App.framework +**/ios/Flutter/Flutter.framework +**/ios/Flutter/Flutter.podspec +**/ios/Flutter/Generated.xcconfig +**/ios/Flutter/ephemeral +**/ios/Flutter/app.flx +**/ios/Flutter/app.zip +**/ios/Flutter/flutter_assets/ +**/ios/Flutter/flutter_export_environment.sh +**/ios/ServiceDefinitions.json +**/ios/Runner/GeneratedPluginRegistrant.* + +# Exceptions to above rules. +!**/ios/**/default.mode1v3 +!**/ios/**/default.mode2v3 +!**/ios/**/default.pbxuser +!**/ios/**/default.perspectivev3 diff --git a/drivers/rethinkdb/.metadata b/drivers/rethinkdb/.metadata new file mode 100644 index 0000000..af897ac --- /dev/null +++ b/drivers/rethinkdb/.metadata @@ -0,0 +1,10 @@ +# This file tracks properties of this Flutter project. +# Used by Flutter tool to assess capabilities and perform upgrades etc. +# +# This file should be version controlled and should not be manually edited. + +version: + revision: b22742018b3edf16c6cadd7b76d9db5e7f9064b5 + channel: stable + +project_type: package diff --git a/drivers/rethinkdb/CHANGELOG.md b/drivers/rethinkdb/CHANGELOG.md new file mode 100644 index 0000000..a87fc62 --- /dev/null +++ b/drivers/rethinkdb/CHANGELOG.md @@ -0,0 +1,43 @@ +# Change Log + +## 1.1.0 + +* Require dart >= 3.5.0 +* Updated to `lints` 5.0.0 +* Updated dependencies +* Updated README +* Fixed linter warnings + +## 1.0.1 + +* Updated README +* Fixed linter warnings + +## 1.0.0 + +* Require dart >= 3.3.0 +* Updated dependencies +* Updated README +* Fixed linter warnings +* Added `lints` 4.0.0 +* Added `hashlib` +* Removed `pbkdf2ns` + +## 0.0.4 + +* Fix readme info + +## 0.0.3 + +* Remove flutter as a dependency +* Fix failing test + +## 0.0.2 + +* Update dependencies +* Update documentation + +## 0.0.1 + +* Null safety support +* [PBKDF2](https://github.com/G0mb/pbkdf2) dependency update diff --git a/drivers/rethinkdb/LICENSE b/drivers/rethinkdb/LICENSE new file mode 100644 index 0000000..c35606d --- /dev/null +++ b/drivers/rethinkdb/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2024, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/drivers/rethinkdb/README.md b/drivers/rethinkdb/README.md
new file mode 100644
index 0000000..6f23feb
--- /dev/null
+++ b/drivers/rethinkdb/README.md
@@ -0,0 +1,60 @@
+# Belatuk RethinkDB
+
+A Dart driver for connecting to RethinkDB, the open-source database for the realtime web. This driver is a fork of [RethinkDB Driver](https://github.com/G0mb/rethink_db) with dependencies upgraded to support Dart 3.
+
+## Getting Started
+
+### Installation
+
+* Start `RethinkDB` as a container service. Refer to [Running RethinkDB](doc/README.md)
+
+* Install from [Pub](https://pub.dev/)
+
+```bash
+dart pub add platform_driver_rethinkdb
+```
+
+* Or add to the `pubspec.yaml` file
+
+```yaml
+dependencies:
+  platform_driver_rethinkdb: ^1.0.0
+```
+
+* Import the package into your project:
+
+```dart
+import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart';
+```
+
+### Example
+
+```dart
+RethinkDb r = RethinkDb();
+
+final connection = await r.connection(
+  db: 'test',
+  host: 'localhost',
+  port: 28015,
+  user: 'admin',
+  password: '',
+);
+
+// Create table
+await r.db('test').tableCreate('tv_shows').run(connection);
+
+// Insert data
+await r.table('tv_shows').insert([
+  {'name': 'Star Trek TNG', 'episodes': 178},
+  {'name': 'Battlestar Galactica', 'episodes': 75}
+]).run(connection);
+
+// Fetch data
+var result = await r.table('tv_shows').get(1).run(connection);
+```
+
+## References
+
+* For more information about RethinkDB, please visit [RethinkDB](https://rethinkdb.com/)
+* For RethinkDB API documentation, please refer to [RethinkDB API](https://rethinkdb.com/api/javascript/)
diff --git a/drivers/rethinkdb/analysis_options.yaml b/drivers/rethinkdb/analysis_options.yaml
new file mode 100644
index 0000000..d186d5c
--- /dev/null
+++ b/drivers/rethinkdb/analysis_options.yaml
@@ -0,0 +1,4 @@
+include: package:lints/recommended.yaml
+analyzer:
+  errors:
+    constant_identifier_names: ignore
diff --git a/drivers/rethinkdb/doc/README.md b/drivers/rethinkdb/doc/README.md
new file mode 100644
index 0000000..e5984f5
--- /dev/null
+++ b/drivers/rethinkdb/doc/README.md
@@ -0,0 +1,50 @@
+# Running RethinkDB in a container
+
+Use the following commands to run RethinkDB as a service using the provided Docker Compose file. `Rancher` or `Docker` needs to be installed. Replace `nerdctl` with `docker` if using Docker.
+ +## Installation + +* Starting the rethinkDB container + + ```bash + nerdctl compose -f docker-compose-rethinkdb.yml -p rethink up -d + ``` + +* Stopping the rethinkDB container + + ```bash + nerdctl compose -f docker-compose-rethinkdb.yml -p rethink stop + nerdctl compose -f docker-compose-rethinkdb.yml -p rethink down + ``` + +* Checking the rethinkDB container log + + ```bash + nerdctl logs rethink-rethinkdb-1 -f + ``` + +## Compose file + +```yaml +services: + rethinkdb: + image: rethinkdb:latest + restart: "no" + ports: + - "8080:8080" + - "28015:28015" + - "29015:29015" + volumes: + - "rethinkdb:/data" + networks: + - appnet + +volumes: + rethinkdb: + driver: local + +networks: + appnet: + + +``` diff --git a/drivers/rethinkdb/doc/docker-compose-rethinkdb.yml b/drivers/rethinkdb/doc/docker-compose-rethinkdb.yml new file mode 100644 index 0000000..693133b --- /dev/null +++ b/drivers/rethinkdb/doc/docker-compose-rethinkdb.yml @@ -0,0 +1,19 @@ +services: + rethinkdb: + image: rethinkdb:latest + restart: "no" + ports: + - "8080:8080" + - "28015:28015" + - "29015:29015" + volumes: + - "rethinkdb:/data" + networks: + - appnet + +volumes: + rethinkdb: + driver: local + +networks: + appnet: diff --git a/drivers/rethinkdb/example/example.dart b/drivers/rethinkdb/example/example.dart new file mode 100644 index 0000000..4461662 --- /dev/null +++ b/drivers/rethinkdb/example/example.dart @@ -0,0 +1,43 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; + +void main() async { + RethinkDb r = RethinkDb(); + Connection conn = await r.connect( + db: 'testDB', + host: "localhost", + port: 28015, + user: "admin", + password: ""); + + // Insert data into RethinkDB + Map createdRecord = await r.table("user_account").insert([ + { + 'id': 1, + 'name': 'William', + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ] + }, + { + 'id': 2, + 'name': 'Peter', + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'nickname': 'Jo' + }, + {'id': 3, 'name': 'Firstname Last'} + ]).run(conn); + + print(createdRecord); + + // Retrive data from RethinkDB + Cursor users = + await r.table("user_account").filter({'name': 'Peter'}).run(conn); + + List userList = await users.toList(); + print(userList); + + conn.close(); +} diff --git a/drivers/rethinkdb/lib/platform_driver_rethinkdb.dart b/drivers/rethinkdb/lib/platform_driver_rethinkdb.dart new file mode 100644 index 0000000..39da167 --- /dev/null +++ b/drivers/rethinkdb/lib/platform_driver_rethinkdb.dart @@ -0,0 +1,652 @@ +library; + +import 'dart:async'; +import 'dart:io'; +import 'dart:typed_data'; + +import 'src/generated/ql2.pb.dart' as p; +import 'dart:collection'; +import 'dart:convert'; +import 'package:hashlib/hashlib.dart' as hashlib; +import 'package:crypto/crypto.dart'; +import 'dart:math' as math; + +part 'src/ast.dart'; +part 'src/errors.dart'; +part 'src/net.dart'; +part 'src/cursor.dart'; + +class AddFunction { + final RqlQuery? 
_rqlQuery; + + AddFunction([this._rqlQuery]); + + Add call(obj) { + if (_rqlQuery != null) { + return Add([_rqlQuery, obj]); + } else if (obj is Args) { + return Add([obj]); + } else { + throw RqlDriverError("Called add with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Add(positionalArguments); + } +} + +/// computes logical 'and' of two or more values +class AndFunction { + final RqlQuery? _rqlQuery; + + AndFunction([this._rqlQuery]); + + And call(obj) { + if (_rqlQuery != null) { + return And([_rqlQuery, obj]); + } else { + throw RqlDriverError("Called and with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return And(positionalArguments); + } +} + +/// If the test expression returns false or null, the [falseBranch] will be executed. +/// In the other cases, the [trueBranch] is the one that will be evaluated. +class BranchFunction { + Branch call(test, [trueBranch, falseBranch]) { + return Branch(test, trueBranch, falseBranch); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + return Branch.fromArgs(Args(invocation.positionalArguments)); + } +} + +class DivFunction { + final RqlQuery? _rqlQuery; + + DivFunction([this._rqlQuery]); + + Div call(number) { + if (_rqlQuery != null) { + return Div([_rqlQuery, number]); + } else if (number is Args) { + return Div([number]); + } else { + throw RqlDriverError("Called div with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Div(positionalArguments); + } +} + +class EqFunction { + final RqlQuery? _rqlQuery; + + EqFunction([this._rqlQuery]); + + Eq call(value) { + if (_rqlQuery != null) { + return Eq([_rqlQuery, value]); + } else if (value is Args) { + return Eq([value]); + } else { + throw RqlDriverError("Called eq with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Eq(positionalArguments); + } +} + +class GeFunction { + final RqlQuery? _rqlQuery; + + GeFunction([this._rqlQuery]); + + Ge call(number) { + if (_rqlQuery != null) { + return Ge([_rqlQuery, number]); + } else if (number is Args) { + return Ge([number]); + } else { + throw RqlDriverError("Called ge with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Ge(positionalArguments); + } +} + +class GtFunction { + final RqlQuery? 
_rqlQuery; + + GtFunction([this._rqlQuery]); + + Gt call(number) { + if (_rqlQuery != null) { + return Gt([_rqlQuery, number]); + } else if (number is Args) { + return Gt([number]); + } else { + throw RqlDriverError("Called gt with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Gt(positionalArguments); + } +} + +class LeFunction { + final RqlQuery? _rqlQuery; + + LeFunction([this._rqlQuery]); + + Le call(number) { + if (_rqlQuery != null) { + return Le([_rqlQuery, number]); + } else if (number is Args) { + return Le([number]); + } else { + throw RqlDriverError("Called le with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Le(positionalArguments); + } +} + +/// Construct a geometric line +class LineFunction { + Line call(point1, point2) { + return Line([point1, point2]); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + return Line(invocation.positionalArguments); + } +} + +class LtFunction { + final RqlQuery? _rqlQuery; + + LtFunction([this._rqlQuery]); + + Lt call(number) { + if (_rqlQuery != null) { + return Lt([_rqlQuery, number]); + } else if (number is Args) { + return Lt([number]); + } else { + throw RqlDriverError("Called lt with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Lt(positionalArguments); + } +} + +/// Executes the mappingFunction for each item in a sequence or array +/// and returns the transformed array. multiple sequences and arrays +/// may be passed +class MapFunction { + RqlMap call(seq, mappingFunction) { + return RqlMap([seq], mappingFunction); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List args = List.from(invocation.positionalArguments); + return RqlMap(args.sublist(0, args.length - 1), args.last); + } +} + +class MulFunction { + final RqlQuery? _rqlQuery; + + MulFunction([this._rqlQuery]); + + Mul call(number) { + if (_rqlQuery != null) { + return Mul([_rqlQuery, number]); + } else if (number is Args) { + return Mul([number]); + } else { + throw RqlDriverError("Called mul with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Mul(positionalArguments); + } +} + +class NeFunction { + final RqlQuery? 
_rqlQuery; + + NeFunction([this._rqlQuery]); + + Ne call(value) { + if (_rqlQuery != null) { + return Ne([_rqlQuery, value]); + } else if (value is Args) { + return Ne([value]); + } else { + throw RqlDriverError("Called ne with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Ne(positionalArguments); + } +} + +/// Adds fields to an object +class ObjectFunction { + final RethinkDb _rethinkdb; + + ObjectFunction(this._rethinkdb); + + RqlObject call(args) { + return RqlObject(args); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + return _rethinkdb.object(invocation.positionalArguments); + } +} + +/// computes logical 'or' of two or more values +class OrFunction { + final RqlQuery? _rqlQuery; + + OrFunction([this._rqlQuery]); + + Or call(number) { + if (_rqlQuery != null) { + return Or([_rqlQuery, number]); + } else { + throw RqlDriverError("Called or with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Or(positionalArguments); + } +} + +/// Construct a geometric polygon +class PolygonFunction { + Polygon call(point1, point2, point3) { + return Polygon([point1, point2, point3]); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + return Polygon(invocation.positionalArguments); + } +} + +/// Evaluate the expr in the context of one or more value bindings. +/// The type of the result is the type of the value returned from expr. +class RqlDoFunction { + final RethinkDb _rethinkdb; + + RqlDoFunction(this._rethinkdb); + + FunCall call(arg, [expr]) { + return FunCall(arg, expr); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List args = List.from(invocation.positionalArguments); + return _rethinkdb.rqlDo(args.sublist(0, args.length - 1), args.last); + } +} + +class SubFunction { + final RqlQuery? _rqlQuery; + + SubFunction([this._rqlQuery]); + + Sub call(number) { + if (_rqlQuery != null) { + return Sub([_rqlQuery, number]); + } else if (number is Args) { + return Sub([number]); + } else { + throw RqlDriverError("Called sub with too few values"); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + if (_rqlQuery != null) { + positionalArguments.add(_rqlQuery); + } + positionalArguments.addAll(invocation.positionalArguments); + return Sub(positionalArguments); + } +} + +class RethinkDb { +// Connection Management + /// Create a new connection to the database server. Accepts the following options: + /// host: the host to connect to (default localhost). + /// port: the port to connect on (default 28015). + /// db: the default database (defaults to test). + /// user: the user name for the db (defaults to admin). + /// password: password for the user (default ""). + Future connect({ + String db = 'test', + String host = "localhost", + int port = 28015, + String user = "admin", + String password = "", + Map? ssl, + }) => + Connection(db, host, port, user, password, ssl).reconnect(); + + /// Reference a database.This command can be chained with other commands to do further processing on the data. + DB db(String dbName) => DB(dbName); + + /// Create a database. 
A RethinkDB database is a collection of tables, similar to relational databases. + /// If successful, the operation returns an object: {created: 1}. If a database with the same name already exists the operation throws RqlRuntimeError. + /// Note: that you can only use alphanumeric characters and underscores for the database name. + DbCreate dbCreate(String dbName) => DbCreate(dbName); + + /// Drop a database. The database, all its tables, and corresponding data will be deleted. + /// If successful, the operation returns the object {dropped: 1}. + /// If the specified database doesn't exist a RqlRuntimeError is thrown. + DbDrop dbDrop(String dbName) => DbDrop(dbName, {}); + + /// List all database names in the system. The result is a list of strings. + DbList dbList() => DbList(); + + /// Returns a rang bewteen the start and end values. If no start or + /// end are specified, an 'infinite' stream will be returned. + Range range([start, end]) { + if (start == null) { + return Range.asStream(); + } else if (end == null) { + return Range(start); + } else { + return Range.withStart(start, end); + } + } + + /// Select all documents in a table. This command can be chained with other commands to do further processing on the data. + Table table(String tableName, [Map? options]) => Table(tableName, options); + + /// Create a table. A RethinkDB table is a collection of JSON documents. + /// If successful, the operation returns an object: {created: 1}. If a table with the same name already exists, the operation throws RqlRuntimeError. + /// Note: that you can only use alphanumeric characters and underscores for the table name. + TableCreate tableCreate(String tableName, [Map? options]) => + TableCreate(tableName, options); + + /// List all table names in a database. The result is a list of strings. + TableList tableList() => TableList(); + + /// Drop a table. The table and all its data will be deleted. + TableDrop tableDrop(String tableName, [Map? options]) => + TableDrop(tableName, options); + + /// Specify ascending order on an attribute + Asc asc(String attr) => Asc(attr); + + /// Specify descending order on an attribute + Desc desc(String attr) => Desc(attr); + + /// Create a time object for a specific time. + Time time(int year, int month, int day, + {String timezone = 'Z', int? hour, int? minute, num? second}) { + if (second != null) { + return Time(Args([year, month, day, hour, minute, second, timezone])); + } else { + return Time(Args([year, month, day, timezone])); + } + } + + /// Create a time object from a Dart DateTime object. + /// + Time nativeTime(DateTime val) => expr(val); + + /// Create a time object based on an iso8601 date-time string (e.g. '2013-01-01T01:01:01+00:00'). + /// We support all valid ISO 8601 formats except for week dates. + /// If you pass an ISO 8601 date-time without a time zone, you must specify the time zone with the optarg default_timezone. + /// + // ignore: non_constant_identifier_names + RqlISO8601 ISO8601(String stringTime, [defaultTimeZone = "Z"]) => + RqlISO8601(stringTime, defaultTimeZone); + + /// Create a time object based on seconds since epoch. + /// The first argument is a double and will be rounded to three decimal places (millisecond-precision). + EpochTime epochTime(int eptime) => EpochTime(eptime); + + /// Return a time object representing the current time in UTC. + /// The command now() is computed once when the server receives the query, so multiple instances of r.now() will always return the same time inside a query. 
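  /// Illustrative sketch (the table name is hypothetical): because the
  /// timestamp is captured once per query, both fields below end up equal:
  ///
  ///   await r.table('events').insert({'created': r.now(), 'seen': r.now()}).run(conn);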
+ Now now() => Now(); + + /// Evaluate the expr in the context of one or more value bindings. + /// The type of the result is the type of the value returned from expr. + dynamic get rqlDo => RqlDoFunction(this); + + /// If the test expression returns false or null, the [falseBranch] will be executed. + /// In the other cases, the [trueBranch] is the one that will be evaluated. + dynamic get branch => BranchFunction(); + + /// Throw a runtime error. If called with no arguments inside the second argument to default, re-throw the current error. + UserError error(String message) => UserError(message, {}); + + /// Create a javascript expression. + JavaScript js(String js, [Map? options]) => JavaScript(js, options); + + /// Parse a JSON string on the server. + Json json(String json) => Json(json, {}); + + /// Count the total size of the group. + Map count = {"COUNT": true}; + + /// Compute the sum of the given field in the group. + Map sum(String attr) => {'SUM': attr}; + + /// Compute the average value of the given attribute for the group. + Map avg(String attr) => {"AVG": attr}; + + /// Returns the currently visited document. + ImplicitVar row = ImplicitVar(); + + /// Adds fields to an object + + dynamic get object => ObjectFunction(this); + + /// Acts like the ruby splat operator; unpacks a list of arguments. + Args args(args) => Args(args); + + /// Returns data from a specified http url + Http http(url, [optargs]) => Http(url, optargs); + + /// Generates a random number between two bounds + Random random([left, right, options]) { + if (right != null) { + return Random.rightBound(left, right, options); + } else if (left != null) { + return Random.leftBound(left, options); + } else { + return Random(options); + } + } + + /// Returns logical inverse of the arguments given + Not not([value]) => Not(value ?? true); + + /// Executes the mappingFunction for each item in a sequence or array + /// and returns the transformed array. multiple sequences and arrays + /// may be passed + dynamic get map => MapFunction(); + + /// computes logical 'and' of two or more values + dynamic get and => AndFunction(); + + /// computes logical 'or' of two or more values + dynamic get or => OrFunction(); + + /// Replace an object in a field instead of merging it with an existing object in a [merge] or [update] operation. + Literal literal(args) => Literal(args); + + /// Convert native dart object into a RqlObject + expr(val) => RqlQuery()._expr(val); + + /// Convert a GeoJSON object to a ReQL geometry object. + GeoJson geojson(Map geoJson) => GeoJson(geoJson); + + /// Construct a circular line or polygon. + Circle circle(point, num radius, [Map? options]) => + Circle(point, radius, options); + + /// Compute the distance between a point and a geometry object + + Distance distance(geo1, geo2, [Map? options]) => + Distance(geo1, geo2, options); + + /// Construct a geometric line + dynamic get line => LineFunction(); + + /// Construct a geometric point + Point point(num long, num lat) => Point(long, lat); + + /// Construct a geometric polygon + dynamic get polygon => PolygonFunction(); + + dynamic get eq => EqFunction(); + + dynamic get ne => NeFunction(); + + dynamic get lt => LtFunction(); + + dynamic get le => LeFunction(); + + dynamic get gt => GtFunction(); + + dynamic get ge => GeFunction(); + + dynamic get add => AddFunction(); + + dynamic get sub => SubFunction(); + + dynamic get mul => MulFunction(); + + dynamic get div => DivFunction(); + + /// Encapsulate binary data within a query. 
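  /// Illustrative sketch (names are hypothetical): wrapping a `Uint8List`
  /// stores it as the BINARY pseudo-type rather than as a plain list of ints:
  ///
  ///   await r.table('files').insert({'name': 'logo.png', 'data': r.binary(bytes)}).run(conn);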
+ Binary binary(var data) => Binary(data); + + RqlTimeName monday = RqlTimeName(p.Term_TermType.MONDAY); + RqlTimeName tuesday = RqlTimeName(p.Term_TermType.TUESDAY); + RqlTimeName wednesday = RqlTimeName(p.Term_TermType.WEDNESDAY); + RqlTimeName thursday = RqlTimeName(p.Term_TermType.THURSDAY); + RqlTimeName friday = RqlTimeName(p.Term_TermType.FRIDAY); + RqlTimeName saturday = RqlTimeName(p.Term_TermType.SATURDAY); + RqlTimeName sunday = RqlTimeName(p.Term_TermType.SUNDAY); + + RqlTimeName january = RqlTimeName(p.Term_TermType.JANUARY); + RqlTimeName february = RqlTimeName(p.Term_TermType.FEBRUARY); + RqlTimeName march = RqlTimeName(p.Term_TermType.MARCH); + RqlTimeName april = RqlTimeName(p.Term_TermType.APRIL); + RqlTimeName may = RqlTimeName(p.Term_TermType.MAY); + RqlTimeName june = RqlTimeName(p.Term_TermType.JUNE); + RqlTimeName july = RqlTimeName(p.Term_TermType.JULY); + RqlTimeName august = RqlTimeName(p.Term_TermType.AUGUST); + RqlTimeName september = RqlTimeName(p.Term_TermType.SEPTEMBER); + RqlTimeName october = RqlTimeName(p.Term_TermType.OCTOBER); + RqlTimeName november = RqlTimeName(p.Term_TermType.NOVEMBER); + RqlTimeName december = RqlTimeName(p.Term_TermType.DECEMBER); + + RqlConstant minval = RqlConstant(p.Term_TermType.MINVAL); + RqlConstant maxval = RqlConstant(p.Term_TermType.MAXVAL); + + Uuid uuid([str]) => Uuid(str); +} diff --git a/drivers/rethinkdb/lib/src/ast.dart b/drivers/rethinkdb/lib/src/ast.dart new file mode 100644 index 0000000..4129a59 --- /dev/null +++ b/drivers/rethinkdb/lib/src/ast.dart @@ -0,0 +1,2208 @@ +part of '../platform_driver_rethinkdb.dart'; + +const defaultNestingDepth = 20; + +List buildInvocationParams(List positionalArguments, + [List? optionsNames]) { + List argsList = []; + argsList.addAll(positionalArguments); + Map? options = {}; + if (argsList.length > 1 && argsList.last is Map) { + if (optionsNames == null) { + options = argsList.removeLast(); + } else { + Map lastArgument = argsList.last; + bool isOptions = true; + lastArgument.forEach((key, _) { + if (!optionsNames.contains(key)) { + isOptions = false; + } + }); + if (isOptions) { + options = argsList.removeLast(); + } + } + } + List invocationParams = [argsList]; + if (options!.isNotEmpty) { + invocationParams.add(options); + } + return invocationParams; +} + +// TODO: handle index. +// TODO: handle multi. +class GroupFunction { + final RqlQuery? _rqlQuery; + + GroupFunction([this._rqlQuery]); + + Group call(args) { + if (args is List) { + return Group(_rqlQuery, args, null); + } else { + return Group(_rqlQuery, [args], null); + } + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + List invocationParams = + buildInvocationParams(positionalArguments, ['index', 'multi']); + return Group(_rqlQuery, invocationParams[0], + invocationParams.length == 2 ? invocationParams[1] : null); + } +} + +class HasFieldsFunction { + final RqlQuery? _rqlQuery; + + HasFieldsFunction([this._rqlQuery]); + + HasFields call(items) { + return HasFields(_rqlQuery, items); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + return HasFields(_rqlQuery, buildInvocationParams(positionalArguments)); + } +} + +class MergeFunction { + final RqlQuery? 
_rqlQuery; + + MergeFunction([this._rqlQuery]); + + Merge call(obj) { + return Merge([_rqlQuery, obj]); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.add(_rqlQuery); + positionalArguments.addAll(invocation.positionalArguments); + return Merge(positionalArguments); + } +} + +class PluckFunction { + final RqlQuery? _rqlQuery; + + PluckFunction([this._rqlQuery]); + + Pluck call(args) { + return Pluck(_rqlQuery!._listify(args, _rqlQuery)); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + return Pluck(_rqlQuery! + ._listify(buildInvocationParams(positionalArguments), _rqlQuery)); + } +} + +// TODO: handle interleave. +class UnionFunction { + final RqlQuery? _rqlQuery; + + UnionFunction([this._rqlQuery]); + + Union call(sequence) { + return Union(_rqlQuery, [sequence]); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + List invocationParams = + buildInvocationParams(positionalArguments, ['interleave']); + if (invocationParams.length == 2) { + return Union(_rqlQuery, [invocationParams[0], invocationParams[1]]); + } else { + return Union(_rqlQuery, invocationParams[0]); + } + } +} + +class WithoutFunction { + final RqlQuery? _rqlQuery; + + WithoutFunction([this._rqlQuery]); + + Without call(items) { + return Without(_rqlQuery!._listify(items, _rqlQuery)); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + return Without(_rqlQuery! + ._listify(buildInvocationParams(positionalArguments), _rqlQuery)); + } +} + +class WithFieldsFunction { + final RqlQuery? _rqlQuery; + + WithFieldsFunction([this._rqlQuery]); + + WithFields call(items) { + return WithFields(_rqlQuery, items); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List positionalArguments = []; + positionalArguments.addAll(invocation.positionalArguments); + return WithFields(_rqlQuery, buildInvocationParams(positionalArguments)); + } +} + +class RqlMapFunction { + final RqlQuery _rqlQuery; + + RqlMapFunction(this._rqlQuery); + + call(mappingFunction) { + if (mappingFunction is List) { + mappingFunction.insert(0, _rqlQuery); + var item = _rqlQuery._funcWrap( + mappingFunction.removeLast(), mappingFunction.length); + return RqlMap(mappingFunction, item); + } + return RqlMap([_rqlQuery], mappingFunction); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List mappingFunction = List.from(invocation.positionalArguments); + mappingFunction.insert(0, _rqlQuery); + var item = _rqlQuery._funcWrap( + mappingFunction.removeLast(), mappingFunction.length); + return RqlMap(mappingFunction, item); + } +} + +class RqlQuery { + p.Term_TermType get tt => p.Term_TermType.ERROR; + + List args = []; + Map optargs = {}; + + RqlQuery([List? args, Map? 
optargs]) { + if (args != null) { + for (var e in args) { + if (_checkIfOptions(e, tt)) { + optargs ??= e; + } else if (e != null) { + this.args.add(_expr(e)); + } + } + } + + if (optargs != null) { + optargs.forEach((k, v) { + if ((k == "conflict") && (v is Function)) { + this.optargs[k] = _expr(v, defaultNestingDepth, 3); + } else { + this.optargs[k] = _expr(v); + } + }); + } + } + + _expr(val, [nestingDepth = defaultNestingDepth, argsCount]) { + if (nestingDepth <= 0) { + throw RqlDriverError("Nesting depth limit exceeded"); + } + + if (nestingDepth is int == false) { + throw RqlDriverError("Second argument to `r.expr` must be a number."); + } + + if (val is RqlQuery) { + return val; + } else if (val is List) { + for (var v in val) { + v = _expr(v, nestingDepth - 1, argsCount); + } + + return MakeArray(val); + } else if (val is Map) { + Map obj = {}; + + val.forEach((k, v) { + obj[k] = _expr(v, nestingDepth - 1, argsCount); + }); + + return MakeObj(obj); + } else if (val is Function) { + return Func(val, argsCount); + } else if (val is DateTime) { + return Time(Args([ + val.year, + val.month, + val.day, + val.hour, + val.minute, + val.second, + _formatTimeZoneOffset(val) + ])); + } + return Datum(val); + } + + String _formatTimeZoneOffset(DateTime val) { + String tz = val.timeZoneOffset.inHours.toString(); + + if (!val.timeZoneOffset.inHours.isNegative) { + tz = "+$tz"; + } + + if (tz.length == 2) { + tz = tz.replaceRange(0, 1, "${tz[0]}0"); + } + + return tz; + } + + Future run(Connection c, [globalOptargs]) { + //if (c == null) { + // throw RqlDriverError("RqlQuery.run must be given a connection to run."); + //} + + return c._start(this, globalOptargs); + } + + //since a term that may take multiple options can now be passed + //one or two, we can't know if the final argument in a query + //is actually an option or just another arg. _check_if_options + //checks if all of the keys in the object are in options + _checkIfOptions(obj, p.Term_TermType? tt) { + if (obj is Map == false) { + return false; + } else { + List? options = _RqlAllOptions(tt).options; + + return obj.keys.every(options!.contains); + } + } + + build() { + List res = []; + + res.add(tt.value); + + List argList = []; + for (var arg in args) { + if (arg != null) { + argList.add(arg.build()); + } + } + res.add(argList); + + if (optargs.isNotEmpty) { + Map optArgsMap = {}; + optargs.forEach((k, v) { + optArgsMap[k] = v.build(); + }); + res.add(optArgsMap); + } + return res; + } + + _recursivelyConvertPseudotypes(obj, formatOpts) { + if (obj is Map) { + obj.forEach((k, v) { + obj[k] = _recursivelyConvertPseudotypes(v, formatOpts); + }); + obj = _convertPseudotype(obj, formatOpts); + } else if (obj is List) { + for (var e in obj) { + e = _recursivelyConvertPseudotypes(e, formatOpts); + } + } + return obj; + } + + _listify(args, [parg]) { + if (args is List) { + args.insert(0, parg); + return args; + } else { + if (args != null) { + if (parg != null) { + return [parg, args]; + } else { + return [args]; + } + } else { + return []; + } + } + } + + bool _ivarScan(query) { + if (query is! 
RqlQuery) { + return false; + } + + if (query is ImplicitVar) { + return true; + } + if (query.args.any(_ivarScan)) { + return true; + } + + var optArgKeys = query.optargs.values; + + if (optArgKeys.any(_ivarScan)) { + return true; + } + return false; + } + + // Called on arguments that should be functions + _funcWrap(val, [argsCount]) { + val = _expr(val, defaultNestingDepth, argsCount); + if (_ivarScan(val)) { + return Func((x) => val, argsCount); + } + return val; + } + + _reqlTypeTimeToDatetime(Map obj) { + if (obj["epoch_time"] == null) { + throw RqlDriverError( + 'pseudo-type TIME object $obj does not have expected field "epoch_time".'); + } else { + String s = obj["epoch_time"].toString(); + if (s.contains(".")) { + List l = s.split('.'); + while (l[1].length < 3) { + l[1] = l[1] + "0"; + } + s = l.join(""); + } else { + s += "000"; + } + return DateTime.fromMillisecondsSinceEpoch(int.parse(s)); + } + } + + _reqlTypeGroupedDataToObject(Map obj) { + if (obj['data'] == null) { + throw RqlDriverError( + 'pseudo-type GROUPED_DATA object $obj does not have the expected field "data".'); + } + + Map retObj = {}; + obj['data'].forEach((e) { + retObj[e[0]] = e[1]; + }); + return retObj; + } + + _convertPseudotype(Map obj, Map? formatOpts) { + String? reqlType = obj['\$reql_type\$']; + if (reqlType != null) { + if (reqlType == 'TIME') { + if (formatOpts == null || formatOpts.isEmpty) { + formatOpts = {"time_format": "native"}; + } + String timeFormat = formatOpts['time_format']; + if (timeFormat == 'native') { + // Convert to native dart DateTime + return _reqlTypeTimeToDatetime(obj); + } else if (timeFormat != 'raw') { + throw RqlDriverError("Unknown time_format run option $timeFormat."); + } + } else if (reqlType == 'GROUPED_DATA') { + if (formatOpts == null || + formatOpts.isEmpty || + formatOpts['group_format'] == 'native') { + return _reqlTypeGroupedDataToObject(obj); + } else if (formatOpts['group_format'] != 'raw') { + throw RqlDriverError( + "Unknown group_format run option ${formatOpts['group_format']}."); + } + } else if (reqlType == "BINARY") { + if (formatOpts == null || formatOpts["binary_format"] == "native") { + /// the official drivers decode the BASE64 string to binary data + /// this driver currently has a bug with its [_reqlTypeBinaryToBytes] + /// for some reason, when trying to convert the index function for + /// `indexWait` commands, we get a FormatException. + /// so for the short term we will just return the BASE64 string + /// with a TODO to find out what is wrong and fix it. 
+ + try { + return _reqlTypeBinaryToBytes(obj); + } on FormatException { + return obj['data']; + } + } else { + throw RqlDriverError( + "Unknown binary_format run option: ${formatOpts["binary_format"]}"); + } + } else if (reqlType == "GEOMETRY") { + obj.remove('\$reql_type\$'); + return obj; + } else { + throw RqlDriverError("Unknown pseudo-type $reqlType"); + } + } + + return obj; + } + + _reqlTypeBinaryToBytes(Map obj) { + return base64.decode(obj['data']); + } + + Update update(args, [options]) => Update(this, _funcWrap(args, 1), options); + + // Comparison operators + dynamic get eq => EqFunction(this); + + dynamic get ne => NeFunction(this); + + dynamic get lt => LtFunction(this); + + dynamic get le => LeFunction(this); + + dynamic get gt => GtFunction(this); + + dynamic get ge => GeFunction(this); + + // Numeric operators + Not not() => Not(this); + + dynamic get add => AddFunction(this); + + dynamic get sub => SubFunction(this); + + dynamic get mul => MulFunction(this); + + dynamic get div => DivFunction(this); + + Mod mod(other) => Mod(this, other); + + dynamic get and => AndFunction(this); + + dynamic get or => OrFunction(this); + + Contains contains(args) => Contains(this, _funcWrap(args, 1)); + + dynamic get hasFields => HasFieldsFunction(this); + + dynamic get withFields => WithFieldsFunction(this); + + Keys keys() => Keys(this); + + Values values() => Values(this); + + Changes changes([Map? opts]) => Changes(this, opts); + + // Polymorphic object/sequence operations + dynamic get pluck => PluckFunction(this); + + dynamic get without => WithoutFunction(this); + + FunCall rqlDo(arg, [expression]) { + if (expression == null) { + return FunCall(this, _funcWrap(arg, 1)); + } else { + return FunCall(_listify(arg, this), _funcWrap(expression, arg.length)); + } + } + + Default rqlDefault(args) => Default(this, args); + + Replace replace(expr, [options]) => + Replace(this, _funcWrap(expr, 1), options); + + Delete delete([options]) => Delete(this, options); + + // Rql type inspection + coerceTo(String type) => CoerceTo(this, type); + + Ungroup ungroup() => Ungroup(this); + + TypeOf typeOf() => TypeOf(this); + + dynamic get merge => MergeFunction(this); + + Append append(val) => Append(this, val); + + Floor floor() => Floor(this); + + Ceil ceil() => Ceil(this); + + Round round() => Round(this); + + Prepend prepend(val) => Prepend(this, val); + + Difference difference(List ar) => Difference(this, ar); + + SetInsert setInsert(val) => SetInsert(this, val); + + SetUnion setUnion(ar) => SetUnion(this, ar); + + SetIntersection setIntersection(ar) => SetIntersection(this, ar); + + SetDifference setDifference(ar) => SetDifference(this, ar); + + GetField getField(index) => GetField(this, index); + + Nth nth(int index) => Nth(this, index); + + Match match(String regex) => Match(this, regex); + + Split split([seperator = " ", maxSplits]) => + Split(this, seperator, maxSplits); + + Upcase upcase() => Upcase(this); + + Downcase downcase() => Downcase(this); + + IsEmpty isEmpty() => IsEmpty(this); + + Slice slice(int start, [end, Map? 
options]) => + Slice(this, start, end, options!); + + Fold fold(base, function, [options]) => Fold(this, base, function, options); + + Skip skip(int i) => Skip(this, i); + + Limit limit(int i) => Limit(this, i); + + Reduce reduce(reductionFunction, [base]) => + Reduce(this, _funcWrap(reductionFunction, 2), base); + + Sum sum([args]) => Sum(this, args); + + Avg avg([args]) => Avg(this, args); + + Min min([args]) => Min(this, args); + + Max max([args]) => Max(this, args); + + dynamic get map => RqlMapFunction(this); + + Filter filter(expr, [options]) => Filter(this, _funcWrap(expr, 1), options); + + ConcatMap concatMap(mappingFunction) => + ConcatMap(this, _funcWrap(mappingFunction, 1)); + + Get get(id) => Get(this, id); + + OrderBy orderBy(attrs, [index]) { + if (attrs is Map && attrs.containsKey("index")) { + index = attrs; + attrs = []; + + index.forEach((k, ob) { + if (ob is Asc || ob is Desc) { + //do nothing + } else { + ob = _funcWrap(ob, 1); + } + }); + } else if (attrs is List) { + if (index is Map == false && index != null) { + attrs.add(index); + index = null; + } + for (var ob in attrs) { + if (ob is Asc || ob is Desc) { + //do nothing + } else { + ob = _funcWrap(ob, 1); + } + } + } else { + List tmp = []; + tmp.add(attrs); + if (index is Map == false && index != null) { + tmp.add(index); + index = null; + } + attrs = tmp; + } + + return OrderBy(_listify(attrs, this), index); + } + + operator +(other) => add(other); + operator -(other) => sub(other); + operator *(other) => mul(other); + operator /(other) => div(other); + // TODO see if we can still do this. != isn't assignable so maybe + // it makes more sense not to do == anyway. + //operator ==(other) => this.eq(other); + operator <=(other) => le(other); + operator >=(other) => ge(other); + operator <(other) => lt(other); + operator >(other) => gt(other); + operator %(other) => mod(other); + operator [](attr) => pluck(attr); + + Between between(lowerKey, [upperKey, options]) => + Between(this, lowerKey, upperKey, options); + + Distinct distinct() => Distinct(this); + + Count count([filter]) { + if (filter == null) return Count(this); + return Count(this, _funcWrap(filter, 1)); + } + + dynamic get union => UnionFunction(this); + + InnerJoin innerJoin(otherSequence, [predicate]) => + InnerJoin(this, otherSequence, predicate); + + OuterJoin outerJoin(otherSequence, [predicate]) => + OuterJoin(this, otherSequence, predicate); + + EqJoin eqJoin(leftAttr, [otherTable, options]) => + EqJoin(this, _funcWrap(leftAttr, 1), otherTable, options); + + Zip zip() => Zip(this); + + dynamic get group => GroupFunction(this); + + ForEach forEach(writeQuery) => ForEach(this, _funcWrap(writeQuery, 1)); + + Info info() => Info(this); + + //Array only operations + + InsertAt insertAt(index, [value]) => InsertAt(this, index, value); + + SpliceAt spliceAt(index, [ar]) => SpliceAt(this, index, ar); + + DeleteAt deleteAt(index, [end]) => DeleteAt(this, index, end); + + ChangeAt changeAt(index, [value]) => ChangeAt(this, index, value); + + Sample sample(int i) => Sample(this, i); + + // Time support + ToISO8601 toISO8601() => ToISO8601(this); + + ToEpochTime toEpochTime() => ToEpochTime(this); + + During during(start, [end, options]) => During(this, start, end, options); + + Date date() => Date(this); + + TimeOfDay timeOfDay() => TimeOfDay(this); + + Timezone timezone() => Timezone(this); + + Year year() => Year(this); + + Month month() => Month(this); + + Day day() => Day(this); + + DayOfWeek dayOfWeek() => DayOfWeek(this); + + DayOfYear dayOfYear() => 
DayOfYear(this); + + Hours hours() => Hours(this); + + Minutes minutes() => Minutes(this); + + Seconds seconds() => Seconds(this); + + InTimezone inTimezone(tz) => InTimezone(this, tz); + + Binary binary(data) => Binary(data); + + Distance distance(geo, [opts]) => Distance(this, geo, opts); + + Fill fill() => Fill(this); + + ToGeoJson toGeojson() => ToGeoJson(this); + + GetIntersecting getIntersecting(geo, Map? options) => + GetIntersecting(this, geo, options); + + GetNearest getNearest(point, [Map? options]) => + GetNearest(this, point, options!); + + Includes includes(geo) => Includes(this, geo); + + Intersects intersects(geo) => Intersects(this, geo); + + PolygonSub polygonSub(var poly) => PolygonSub(this, poly); + + Config config() => Config(this); + + Rebalance rebalance() => Rebalance(this); + + Reconfigure reconfigure(Map? options) => Reconfigure(this, options); + + Status status() => Status(this); + + Wait wait([Map? options]) => Wait(this, options!); + + call(attr) { + return GetField(this, attr); + } +} + +//TODO write pretty compose functions +class RqlBoolOperQuery extends RqlQuery { + RqlBoolOperQuery([super.args, super.optargs]); +} + +class RqlBiOperQuery extends RqlQuery { + RqlBiOperQuery([super.args, super.optargs]); +} + +class RqlBiCompareOperQuery extends RqlBiOperQuery { + RqlBiCompareOperQuery([super.args, super.optargs]); +} + +class RqlTopLevelQuery extends RqlQuery { + RqlTopLevelQuery([super.args, super.optargs]); +} + +class RqlMethodQuery extends RqlQuery { + RqlMethodQuery([super.args, super.optargs]); +} + +class RqlBracketQuery extends RqlMethodQuery { + RqlBracketQuery([super.args, super.optargs]); +} + +class Datum extends RqlQuery { + dynamic data; + + Datum(val) : super(null, null) { + data = val; + } + + @override + build() { + return data; + } +} + +class MakeArray extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MAKE_ARRAY; + + MakeArray(super.args); +} + +class MakeObj extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MAKE_OBJ; + + MakeObj(objDict) : super(null, objDict); + + @override + build() { + var res = {}; + optargs.forEach((k, v) { + res[k is RqlQuery ? k.build() : k] = v is RqlQuery ? v.build() : v; + }); + return res; + } +} + +class Var extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.VAR; + + Var(args) : super([args]); + + @override + call(attr) => GetField(this, attr); +} + +class JavaScript extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.JAVASCRIPT; + + JavaScript(args, [optargs]) : super([args], optargs); +} + +class Http extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.HTTP; + + Http(args, [optargs]) : super([args], optargs); +} + +class UserError extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ERROR; + + UserError(args, [optargs]) : super([args], optargs); +} + +class Random extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.RANDOM; + + Random(optargs) : super(null, optargs ?? {}); + + Random.leftBound(left, optargs) : super([left], optargs ?? {}); + + Random.rightBound(left, right, optargs) : super([left, right], optargs ?? 
{}); +} + +class Changes extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CHANGES; + + Changes([arg, opts]) : super([arg], opts); +} + +class Fold extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FOLD; + + Fold(seq, base, func, [opts]) : super([seq, base, func], opts); +} + +class Grant extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GRANT; + + Grant([scope, user, options]) : super([scope, user], options); +} + +class Default extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DEFAULT; + + Default(obj, value) : super([obj, value]); +} + +class ImplicitVar extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.IMPLICIT_VAR; + + ImplicitVar() : super(); + + @override + call(attr) => GetField(this, attr); +} + +class Eq extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.EQ; + + Eq(super.numbers); +} + +class Ne extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.NE; + + Ne(super.numbers); +} + +class Lt extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.LT; + + Lt(super.numbers); +} + +class Le extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.LE; + + Le(super.numbers); +} + +class Gt extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GT; + + Gt(super.numbers); +} + +class Ge extends RqlBiCompareOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GE; + + Ge(super.numbers); +} + +class Not extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.NOT; + + Not([args]) : super([args]); +} + +class Add extends RqlBiOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ADD; + + Add(super.objects); +} + +class Sub extends RqlBiOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SUB; + + Sub(super.numbers); +} + +class Mul extends RqlBiOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MUL; + + Mul(super.numbers); +} + +class Div extends RqlBiOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DIV; + + Div(super.numbers); +} + +class Mod extends RqlBiOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MOD; + + Mod(modable, obj) : super([modable, obj]); +} + +class Append extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.APPEND; + + Append(ar, val) : super([ar, val]); +} + +class Floor extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FLOOR; + + Floor(ar) : super([ar]); +} + +class Ceil extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CEIL; + + Ceil(ar) : super([ar]); +} + +class Round extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ROUND; + + Round(ar) : super([ar]); +} + +class Prepend extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.PREPEND; + + Prepend(ar, val) : super([ar, val]); +} + +class Difference extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DIFFERENCE; + + Difference(diffable, ar) : super([diffable, ar]); +} + +class SetInsert extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SET_INSERT; + + SetInsert(ar, val) : super([ar, val]); +} + +class SetUnion extends RqlMethodQuery { + @override + p.Term_TermType get tt => 
p.Term_TermType.SET_UNION; + + SetUnion(un, val) : super([un, val]); +} + +class SetIntersection extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SET_INTERSECTION; + + SetIntersection(inter, ar) : super([inter, ar]); +} + +class SetDifference extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SET_DIFFERENCE; + + SetDifference(diff, ar) : super([diff, ar]); +} + +class Slice extends RqlBracketQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SLICE; + + Slice(selection, int start, [end, Map? options]) + : super([selection, start, end], options); +} + +class Skip extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SKIP; + + Skip(selection, int number) : super([selection, number]); +} + +class Limit extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.LIMIT; + + Limit(selection, int number) : super([selection, number]); +} + +class GetField extends RqlBracketQuery { + @override + p.Term_TermType get tt => p.Term_TermType.BRACKET; + + GetField(obj, field) : super([obj, field]); +} + +class Contains extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CONTAINS; + + Contains(tbl, items) : super([tbl, items]); +} + +class HasFields extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.HAS_FIELDS; + + HasFields(obj, items) : super([obj, items]); +} + +class WithFields extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.WITH_FIELDS; + + WithFields(obj, fields) : super([obj, fields]); +} + +class Keys extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.KEYS; + + Keys(obj) : super([obj]); +} + +class Values extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.VALUES; + + Values(obj) : super([obj]); +} + +class RqlObject extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.OBJECT; + + RqlObject(super.args); +} + +class Pluck extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.PLUCK; + + Pluck(super.items); +} + +class Without extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.WITHOUT; + + Without(super.items); +} + +class Merge extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MERGE; + + Merge(super.objects); +} + +class Between extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.BETWEEN; + + Between(tbl, lower, upper, [options]) : super([tbl, lower, upper], options); +} + +class DB extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DB; + + DB(String dbName) : super([dbName]); + + TableList tableList() => TableList(this); + + TableCreate tableCreate(String tableName, [Map? options]) => + TableCreate.fromDB(this, tableName, options); + + TableDrop tableDrop(String tableName) => TableDrop.fromDB(this, tableName); + + Table table(String tableName, [Map? options]) => + Table.fromDB(this, tableName, options); + + Grant grant(String user, [Map? 
options]) => Grant(this, user, options); +} + +class FunCall extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FUNCALL; + + FunCall(argslist, expression) : super() { + List temp = []; + temp.add(expression); + int argsCount; + if (argslist is List) { + argsCount = argslist.length; + temp.addAll(argslist); + } else { + argsCount = 1; + temp.add(argslist); + } + + args.addAll(temp.map((arg) { + return _expr(arg, defaultNestingDepth, argsCount); + })); + } +} + +class GetAllFunction extends RqlQuery { + final Table _table; + + GetAllFunction(this._table); + + @override + GetAll call(attr, [options]) { + if (options != null && options is Map == false) { + attr = _listify(attr, _table); + options = attr.add(options); + return GetAll(attr, options); + } + return GetAll(_listify(attr, _table), options); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List argsList = []; + argsList.addAll(invocation.positionalArguments); + return Function.apply(call, [argsList]); + } +} + +class IndexStatusFunction extends RqlQuery { + final Table _table; + + IndexStatusFunction(this._table); + + @override + IndexStatus call([attr]) { + if (attr == null) { + return IndexStatus.all(_table); + } + return IndexStatus(_table, attr); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List argsList = []; + argsList.addAll(invocation.positionalArguments); + return Function.apply(call, [argsList]); + } +} + +class IndexWaitFunction extends RqlQuery { + final Table _table; + + IndexWaitFunction(this._table); + + @override + IndexWait call([attr]) { + if (attr == null) { + return IndexWait.all(_table); + } + return IndexWait(_table, attr); + } + + @override + dynamic noSuchMethod(Invocation invocation) { + List argsList = []; + argsList.addAll(invocation.positionalArguments); + return Function.apply(call, [argsList]); + } +} + +class Table extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TABLE; + + Table(String tableName, [Map? options]) : super([tableName], options); + + Table.fromDB(DB db, String tableName, [Map? options]) + : super([db, tableName], options); + + Insert insert(records, [options]) => Insert(this, records, options); + + Grant grant(user, [options]) => Grant(this, user, options); + + IndexList indexList() => IndexList(this); + + IndexCreate indexCreate(indexName, [indexFunction, Map? options]) { + if (indexFunction == null && options == null) { + return IndexCreate(this, indexName); + } else if (indexFunction != null && indexFunction is Map) { + return IndexCreate(this, indexName, indexFunction); + } + return IndexCreate.withIndexFunction( + this, indexName, _funcWrap(indexFunction, 1), options); + } + + IndexDrop indexDrop(indexName) => IndexDrop(this, indexName); + + IndexRename indexRename(oldName, newName, [Map? 
options]) => + IndexRename(this, oldName, newName, options); + + dynamic get indexStatus => IndexStatusFunction(this); + + dynamic get indexWait => IndexWaitFunction(this); + + @override + Update update(args, [options]) => Update(this, _funcWrap(args, 1), options); + + Sync sync() => Sync(this); + + dynamic get getAll => GetAllFunction(this); + + @override + InnerJoin innerJoin(otherSequence, [predicate]) => + InnerJoin(this, otherSequence, predicate); +} + +class Get extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GET; + + Get(table, key) : super([table, key]); + + @override + call(attr) { + return GetField(this, attr); + } +} + +class GetAll extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GET_ALL; + + GetAll(super.keys, [super.options]); + + @override + call(attr) { + return GetField(this, attr); + } +} + +class Reduce extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.REDUCE; + + Reduce(seq, reductionFunction, [base]) + : super([seq, reductionFunction], base); +} + +class Sum extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SUM; + + Sum(obj, args) : super([obj, args]); +} + +class Avg extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.AVG; + + Avg(obj, args) : super([obj, args]); +} + +class Min extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MIN; + + Min(obj, args) : super([obj, args]); +} + +class Max extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MAX; + + Max(obj, args) : super([obj, args]); +} + +class RqlMap extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MAP; + + RqlMap(argslist, expression) : super() { + int argsCount = argslist.length; + List temp = []; + temp.addAll(argslist); + temp.add(_funcWrap(expression, argsCount)); + args.addAll(temp.map((arg) { + return _expr(arg, defaultNestingDepth, argsCount); + })); + } +} + +class Filter extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FILTER; + + Filter(selection, predicate, [Map? 
options]) + : super([selection, predicate], options); +} + +class ConcatMap extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CONCAT_MAP; + + ConcatMap(seq, mappingFunction) : super([seq, mappingFunction]); +} + +class OrderBy extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ORDER_BY; + + OrderBy(super.args, [super.options]); +} + +class Distinct extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DISTINCT; + + Distinct(sequence) : super([sequence]); +} + +class Count extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.COUNT; + + Count([seq, filter]) : super([seq, filter]); +} + +class Union extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.UNION; + + Union(first, second) : super([first, second]); +} + +class Nth extends RqlBracketQuery { + @override + p.Term_TermType get tt => p.Term_TermType.NTH; + + Nth(selection, int index) : super([selection, index]); +} + +class Match extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MATCH; + + Match(obj, regex) : super([obj, regex]); +} + +class Split extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SPLIT; + + Split(tbl, [obj, maxSplits]) : super([tbl, obj, maxSplits]); +} + +class Upcase extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.UPCASE; + + Upcase(obj) : super([obj]); +} + +class Downcase extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DOWNCASE; + + Downcase(obj) : super([obj]); +} + +class OffsetsOf extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.OFFSETS_OF; + + OffsetsOf(seq, index) : super([seq, index]); +} + +class IsEmpty extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.IS_EMPTY; + + IsEmpty(selection) : super([selection]); +} + +class Group extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GROUP; + + Group(obj, groups, [options]) : super([obj, ...groups], options); +} + +class InnerJoin extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INNER_JOIN; + + InnerJoin(first, second, predicate) : super([first, second, predicate]); +} + +class OuterJoin extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.OUTER_JOIN; + + OuterJoin(first, second, predicate) : super([first, second, predicate]); +} + +class EqJoin extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.EQ_JOIN; + + EqJoin(first, second, predicate, [Map? options]) + : super([first, second, predicate], options); +} + +class Zip extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ZIP; + + Zip(seq) : super([seq]); +} + +class CoerceTo extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.COERCE_TO; + + CoerceTo(obj, String type) : super([obj, type]); +} + +class Ungroup extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.UNGROUP; + + Ungroup(obj) : super([obj]); +} + +class TypeOf extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TYPE_OF; + + TypeOf(obj) : super([obj]); +} + +class Update extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.UPDATE; + + Update(tbl, expression, [Map? 
options]) : super([tbl, expression], options); +} + +class Delete extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DELETE; + + Delete(selection, [Map? options]) : super([selection], options); +} + +class Replace extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.REPLACE; + + Replace(table, expression, [options]) : super([table, expression], options); +} + +class Insert extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INSERT; + + Insert(table, records, [Map? options]) : super([table, records], options); +} + +class DbCreate extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DB_CREATE; + + DbCreate(String dbName, [Map? options]) : super([dbName], options); +} + +class DbDrop extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DB_DROP; + + DbDrop(String dbName, [Map? options]) : super([dbName], options); +} + +class DbList extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DB_LIST; + + DbList() : super(); +} + +class Range extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.RANGE; + + Range(end) : super([end]); + + Range.asStream() : super(); + + Range.withStart(start, end) : super([start, end]); +} + +class TableCreate extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TABLE_CREATE; + + TableCreate(table, [Map? options]) : super([table], options); + + TableCreate.fromDB(db, table, [Map? options]) : super([db, table], options); +} + +class TableDrop extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TABLE_DROP; + + TableDrop(tbl, [Map? options]) : super([tbl], options); + + TableDrop.fromDB(db, tbl, [Map? options]) : super([db, tbl], options); +} + +class TableList extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TABLE_LIST; + + TableList([db]) : super(db == null ? [] : [db]); +} + +class IndexCreate extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_CREATE; + + IndexCreate(tbl, index, [Map? options]) : super([tbl, index], options); + + IndexCreate.withIndexFunction(tbl, index, [indexFunction, Map? options]) + : super([tbl, index, indexFunction], options); +} + +class IndexDrop extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_DROP; + + IndexDrop(table, index) : super([table, index]); +} + +class IndexRename extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_RENAME; + + IndexRename(table, oldName, newName, options) + : super([table, oldName, newName], options); +} + +class IndexList extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_LIST; + + IndexList(table) : super([table]); +} + +class IndexStatus extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_STATUS; + + IndexStatus(tbl, indexList) + : super([tbl, indexList is List ? Args(indexList) : indexList]); + IndexStatus.all(tbl) : super([tbl]); +} + +class IndexWait extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INDEX_WAIT; + + IndexWait(tbl, indexList) + : super([tbl, indexList is List ? 
Args(indexList) : indexList]); + IndexWait.all(tbl) : super([tbl]); +} + +class Sync extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SYNC; + + Sync(table) : super([table]); +} + +class Branch extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.BRANCH; + + Branch(predicate, trueBranch, falseBranch) + : super([predicate, trueBranch, falseBranch]); + Branch.fromArgs(Args args) : super([args]); +} + +class Or extends RqlBoolOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.OR; + + Or(super.orables); +} + +class And extends RqlBoolOperQuery { + @override + p.Term_TermType get tt => p.Term_TermType.AND; + + And(super.andables); +} + +class ForEach extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FOR_EACH; + + ForEach(obj, writeQuery) : super([obj, writeQuery]); +} + +class Info extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INFO; + + Info(knowable) : super([knowable]); +} + +class InsertAt extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INSERT_AT; + + InsertAt(ar, index, value) : super([ar, index, value]); +} + +class SpliceAt extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SPLICE_AT; + + SpliceAt(ar, index, value) : super([ar, index, value]); +} + +class DeleteAt extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DELETE_AT; + + DeleteAt(ar, index, value) : super([ar, index, value]); +} + +class ChangeAt extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CHANGE_AT; + + ChangeAt(ar, index, value) : super([ar, index, value]); +} + +class Sample extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SAMPLE; + + Sample(selection, int i) : super([selection, i]); +} + +class Uuid extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.UUID; + Uuid(str) : super(str == null ? [] : [str]); +} + +class Json extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.JSON; + + Json(String jsonString, [Map? options]) : super([jsonString], options); +} + +class Args extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ARGS; + + Args(List array) : super([array]); +} + +class ToISO8601 extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TO_ISO8601; + + ToISO8601(obj) : super([obj]); +} + +class During extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DURING; + + During(obj, start, end, [Map? 
options]) : super([obj, start, end], options); +} + +class Date extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DATE; + + Date(obj) : super([obj]); +} + +class TimeOfDay extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TIME_OF_DAY; + + TimeOfDay(obj) : super([obj]); +} + +class Timezone extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TIMEZONE; + + Timezone(zone) : super([zone]); +} + +class Year extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.YEAR; + + Year(year) : super([year]); +} + +class Month extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MONTH; + + Month(month) : super([month]); +} + +class Day extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DAY; + + Day(day) : super([day]); +} + +class DayOfWeek extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DAY_OF_WEEK; + + DayOfWeek(dow) : super([dow]); +} + +class DayOfYear extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DAY_OF_YEAR; + + DayOfYear(doy) : super([doy]); +} + +class Hours extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.HOURS; + + Hours(hours) : super([hours]); +} + +class Minutes extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.MINUTES; + + Minutes(minutes) : super([minutes]); +} + +class Seconds extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.SECONDS; + + Seconds(seconds) : super([seconds]); +} + +class Binary extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.BINARY; + Binary(data) : super([data]); +} + +class Time extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TIME; + + Time(Args args) : super([args]); + + Time.withHour(int year, int month, int day, String timezone, int hour) + : super([year, month, day, hour, timezone]); + + Time.withMinute( + int year, int month, int day, String timezone, int hour, int minute) + : super([year, month, day, hour, minute, timezone]); + + Time.withSecond(int year, int month, int day, String timezone, int hour, + int minute, int second) + : super([year, month, day, hour, minute, second, timezone]); +} + +class RqlISO8601 extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ISO8601; + + RqlISO8601(strTime, [defaultTimeZone = "Z"]) + : super([strTime], {"default_timezone": defaultTimeZone}); +} + +class EpochTime extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.EPOCH_TIME; + + EpochTime(eptime) : super([eptime]); +} + +class Now extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.NOW; + + Now() : super(); +} + +class InTimezone extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.IN_TIMEZONE; + + InTimezone(zoneable, tz) : super([zoneable, tz]); +} + +class ToEpochTime extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TO_EPOCH_TIME; + + ToEpochTime(obj) : super([obj]); +} + +class Func extends RqlQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FUNC; + Function fun; + int argsCount; + static int nextId = 0; + Func(this.fun, this.argsCount) : super(null, null) { + List vrs = []; + List vrids = []; + + for (int i = 0; i < argsCount; i++) { + vrs.add(Var(Func.nextId)); + vrids.add(Func.nextId); + Func.nextId++; + } 
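+    // Each fresh id above backs one Var placeholder; below, the ids are wrapped in a MAKE_ARRAY parameter list and the FUNC body is produced by applying the Dart closure to those placeholders.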
+ + args = [MakeArray(vrids), _expr(Function.apply(fun, vrs))]; + } +} + +class Asc extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.ASC; + + Asc(obj) : super([obj]); +} + +class Desc extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DESC; + + Desc(attr) : super([attr]); +} + +class Literal extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.LITERAL; + + Literal(attr) : super([attr]); +} + +class Circle extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CIRCLE; + + Circle(point, radius, [Map? options]) : super([point, radius], options); +} + +class Distance extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.DISTANCE; + + Distance(obj, geo, [Map? options]) : super([obj, geo], options); +} + +class Fill extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.FILL; + + Fill(obj) : super([obj]); +} + +class GeoJson extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GEOJSON; + + GeoJson(Map geoJson) : super([geoJson]); +} + +class ToGeoJson extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.TO_GEOJSON; + + ToGeoJson(obj) : super([obj]); +} + +class GetIntersecting extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GET_INTERSECTING; + + GetIntersecting(table, geo, [Map? options]) : super([table, geo], options); +} + +class GetNearest extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.GET_NEAREST; + + GetNearest(table, point, Map? options) : super([table, point], options); +} + +class Includes extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INCLUDES; + + Includes(obj, geo) : super([obj, geo]); +} + +class Intersects extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.INTERSECTS; + + Intersects(obj, geo) : super([obj, geo]); +} + +class Line extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.LINE; + + Line(super.points); +} + +class Point extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.POINT; + + Point(long, lat) : super([long, lat]); +} + +class Polygon extends RqlTopLevelQuery { + @override + p.Term_TermType get tt => p.Term_TermType.POLYGON; + + Polygon(super.points); +} + +class PolygonSub extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.POLYGON_SUB; + + PolygonSub(var poly1, var poly2) : super([poly1, poly2]); +} + +class Config extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.CONFIG; + + Config(obj) : super([obj]); +} + +class Rebalance extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.REBALANCE; + + Rebalance(obj) : super([obj]); +} + +class Reconfigure extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.RECONFIGURE; + + Reconfigure(obj, Map? options) : super([obj], options); +} + +class Status extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.STATUS; + + Status(obj) : super([obj]); +} + +class Wait extends RqlMethodQuery { + @override + p.Term_TermType get tt => p.Term_TermType.WAIT; + + Wait(obj, [Map? options]) : super([obj], options); +} + +class RqlTimeName extends RqlQuery { + final p.Term_TermType? 
_tt; + + @override + p.Term_TermType get tt => _tt!; + + RqlTimeName(this._tt) : super(); +} + +class RqlConstant extends RqlQuery { + final p.Term_TermType? _tt; + + @override + p.Term_TermType get tt => _tt!; + + RqlConstant(this._tt) : super(); +} + +class _RqlAllOptions { + //list of every option from any term + List? options; + + _RqlAllOptions(p.Term_TermType? tt) { + switch (tt) { + case p.Term_TermType.TABLE_CREATE: + options = ['primary_key', 'durability', 'datacenter']; + break; + case p.Term_TermType.INSERT: + options = ['durability', 'return_changes', 'conflict']; + break; + case p.Term_TermType.UPDATE: + options = ['durability', 'return_changes', 'non_atomic']; + break; + case p.Term_TermType.REPLACE: + options = ['durability', 'return_changes', 'non_atomic']; + break; + case p.Term_TermType.DELETE: + options = ['durability', 'return_changes']; + break; + case p.Term_TermType.TABLE: + options = ['read_mode']; + break; + case p.Term_TermType.INDEX_CREATE: + options = ["multi"]; + break; + case p.Term_TermType.GET_ALL: + options = ['index']; + break; + case p.Term_TermType.BETWEEN: + options = ['index', 'left_bound', 'right_bound']; + break; + case p.Term_TermType.FILTER: + options = ['default']; + break; + case p.Term_TermType.CHANGES: + options = ['includeOffsets', 'includeTypes']; + break; + case p.Term_TermType.EQ_JOIN: + options = ['index', 'ordered']; + break; + case p.Term_TermType.UNION: + options = ['interleave']; + break; + case p.Term_TermType.SLICE: + options = ['left_bound', 'right_bound']; + break; + case p.Term_TermType.GROUP: + options = ['index']; + break; + case p.Term_TermType.RANDOM: + options = ['float']; + break; + case p.Term_TermType.ISO8601: + options = ['default_timezone']; + break; + case p.Term_TermType.DURING: + options = ['left_bound', 'right_bound']; + break; + case p.Term_TermType.JAVASCRIPT: + options = ['timeout']; + break; + case p.Term_TermType.HTTP: + options = [ + 'timeout', + 'attempts', + 'redirects', + 'verify', + 'result_format', + 'method', + 'auth', + 'params', + 'header', + 'data', + 'page', + 'page_limit' + ]; + break; + case p.Term_TermType.CIRCLE: + options = ['num_vertices', 'geo_system', 'unit', 'fill']; + break; + case p.Term_TermType.GET_NEAREST: + options = ['index', 'max_results', 'max_dist', 'unit', 'geo_system']; + break; + case p.Term_TermType.RECONFIGURE: + options = [ + 'shards', + 'replicas', + 'primary_replica_tag', + 'dry_run', + "emergency_repair" + ]; + break; + case p.Term_TermType.WAIT: + options = ['wait_for', 'timeout']; + break; + default: + options = []; + } + } +} diff --git a/drivers/rethinkdb/lib/src/cursor.dart b/drivers/rethinkdb/lib/src/cursor.dart new file mode 100644 index 0000000..e45dd5e --- /dev/null +++ b/drivers/rethinkdb/lib/src/cursor.dart @@ -0,0 +1,96 @@ +part of '../platform_driver_rethinkdb.dart'; + +class Cursor extends Stream { + final Connection _conn; + final Query _query; + final Map _opts; + int _outstandingRequests = 0; + bool _endFlag = false; + final StreamController _s = StreamController(); + + Cursor(this._conn, this._query, this._opts); + + _extend(Response response) { + _endFlag = response._type != p.Response_ResponseType.SUCCESS_PARTIAL.value; + + if (response._type != p.Response_ResponseType.SUCCESS_PARTIAL.value && + response._type != p.Response_ResponseType.SUCCESS_SEQUENCE.value) { + _s.addError( + RqlDriverError("Unexpected response type received for cursor"), null); + } + + try { + _conn._checkErrorResponse(response, _query._term); + } catch (e) { + _s.addError(e); + } + 
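+    // Convert this batch's pseudotypes and stream it to listeners; while the response is SUCCESS_PARTIAL a CONTINUE query is queued so the next batch is fetched, otherwise the controller is closed.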
+ var convertedData = + _query._recursivelyConvertPseudotypes(response._data, _opts); + _s.addStream(Stream.fromIterable(convertedData)).then((f) { + if (!_endFlag) { + _outstandingRequests++; + Query query = + Query(p.Query_QueryType.CONTINUE, _query._token, null, null); + query._cursor = this; + _conn._sendQueue.addLast(query); + _conn._sendQuery(); + } else { + _s.close(); + } + }); + } + + Future close() => _s.close(); + + @override + StreamSubscription listen(Function(dynamic)? onData, + {Function? onError, Function()? onDone, bool? cancelOnError}) { + return _s.stream.listen(onData, + onError: onError, onDone: onDone, cancelOnError: cancelOnError); + } +} + +class Feed extends Cursor { + Feed(super.conn, super.opts, super.query); + + @override + toSet() => throw RqlDriverError("`toSet` is not available for feeds."); + @override + toList() => throw RqlDriverError("`toList` is not available for feeds."); + @override + toString() => "Instance of 'Feed'"; +} + +class UnionedFeed extends Cursor { + UnionedFeed(super.conn, super.opts, super.query); + + @override + toSet() => throw RqlDriverError("`toSet` is not available for feeds."); + @override + toList() => throw RqlDriverError("`toList` is not available for feeds."); + @override + toString() => "Instance of 'UnionedFeed'"; +} + +class AtomFeed extends Cursor { + AtomFeed(super.conn, super.opts, super.query); + + @override + toSet() => throw RqlDriverError("`toSet` is not available for feeds."); + @override + toList() => throw RqlDriverError("`toList` is not available for feeds."); + @override + toString() => "Instance of 'AtomFeed'"; +} + +class OrderByLimitFeed extends Cursor { + OrderByLimitFeed(super.conn, super.opts, super.query); + + @override + toSet() => throw RqlDriverError("`toSet` is not available for feeds."); + @override + toList() => throw RqlDriverError("`toList` is not available for feeds."); + @override + toString() => "Instance of 'OrderByLimitFeed'"; +} diff --git a/drivers/rethinkdb/lib/src/errors.dart b/drivers/rethinkdb/lib/src/errors.dart new file mode 100644 index 0000000..8532b46 --- /dev/null +++ b/drivers/rethinkdb/lib/src/errors.dart @@ -0,0 +1,64 @@ +part of '../platform_driver_rethinkdb.dart'; + +class RqlError implements Exception { + String message; + dynamic term; + dynamic frames; + + RqlError(this.message, this.term, this.frames); + + @override + toString() => "$runtimeType\n\n$message\n\n$term\n\n$frames"; +} + +class RqlClientError extends RqlError { + RqlClientError(super.message, super.term, super.frames); +} + +class RqlCompileError extends RqlError { + RqlCompileError(super.message, super.term, super.frames); +} + +class RqlRuntimeError extends RqlError { + RqlRuntimeError(super.message, super.term, super.frames); +} + +class RqlDriverError implements Exception { + String message; + RqlDriverError(this.message); + + @override + toString() => message; +} + +class ReqlInternalError extends RqlRuntimeError { + ReqlInternalError(super.message, super.term, super.frames); +} + +class ReqlResourceLimitError extends RqlRuntimeError { + ReqlResourceLimitError(super.message, super.term, super.frames); +} + +class ReqlQueryLogicError extends RqlRuntimeError { + ReqlQueryLogicError(super.message, super.term, super.frames); +} + +class ReqlNonExistenceError extends RqlRuntimeError { + ReqlNonExistenceError(super.message, super.term, super.frames); +} + +class ReqlOpFailedError extends RqlRuntimeError { + ReqlOpFailedError(super.message, super.term, super.frames); +} + +class ReqlOpIndeterminateError extends 
RqlRuntimeError { + ReqlOpIndeterminateError(super.message, super.term, super.frames); +} + +class ReqlUserError extends RqlRuntimeError { + ReqlUserError(super.message, super.term, super.frames); +} + +class ReqlPermissionError extends RqlRuntimeError { + ReqlPermissionError(super.message, super.term, super.frames); +} diff --git a/drivers/rethinkdb/lib/src/generated/ql2.pb.dart b/drivers/rethinkdb/lib/src/generated/ql2.pb.dart new file mode 100644 index 0000000..31fe901 --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/ql2.pb.dart @@ -0,0 +1,1109 @@ +// Generated code. Do not modify. +// source: ql2.proto +// +// @dart = 2.12 +// ignore_for_file: annotate_overrides,camel_case_types,unnecessary_const,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type,unnecessary_this,prefer_final_fields + +import 'dart:core' as $core; + +import 'package:fixnum/fixnum.dart' as $fixnum; +import 'package:protobuf/protobuf.dart' as $pb; + +import 'ql2.pbenum.dart'; + +export 'ql2.pbenum.dart'; + +class VersionDummy extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'VersionDummy', + createEmptyInstance: create) + ..hasRequiredFields = false; + + VersionDummy._() : super(); + factory VersionDummy() => create(); + factory VersionDummy.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory VersionDummy.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + VersionDummy clone() => VersionDummy()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + VersionDummy copyWith(void Function(VersionDummy) updates) => + super.copyWith((message) => updates(message as VersionDummy)) + as VersionDummy; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static VersionDummy create() => VersionDummy._(); + VersionDummy createEmptyInstance() => create(); + static $pb.PbList createRepeated() => + $pb.PbList(); + @$core.pragma('dart2js:noInline') + static VersionDummy getDefault() => _defaultInstance ??= + $pb.GeneratedMessage.$_defaultFor(create); + static VersionDummy? _defaultInstance; +} + +class Query_AssocPair extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Query.AssocPair', + createEmptyInstance: create) + ..aOS( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'key') + ..aOM( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'val', + subBuilder: Term.create) + ..hasRequiredFields = false; + + Query_AssocPair._() : super(); + factory Query_AssocPair({ + $core.String? key, + Term? 
val, + }) { + final result = create(); + if (key != null) { + result.key = key; + } + if (val != null) { + result.val = val; + } + return result; + } + factory Query_AssocPair.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Query_AssocPair.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Query_AssocPair clone() => Query_AssocPair()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Query_AssocPair copyWith(void Function(Query_AssocPair) updates) => + super.copyWith((message) => updates(message as Query_AssocPair)) + as Query_AssocPair; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Query_AssocPair create() => Query_AssocPair._(); + Query_AssocPair createEmptyInstance() => create(); + static $pb.PbList createRepeated() => + $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Query_AssocPair getDefault() => _defaultInstance ??= + $pb.GeneratedMessage.$_defaultFor(create); + static Query_AssocPair? _defaultInstance; + + @$pb.TagNumber(1) + $core.String get key => $_getSZ(0); + @$pb.TagNumber(1) + set key($core.String v) { + $_setString(0, v); + } + + @$pb.TagNumber(1) + $core.bool hasKey() => $_has(0); + @$pb.TagNumber(1) + void clearKey() => clearField(1); + + @$pb.TagNumber(2) + Term get val => $_getN(1); + @$pb.TagNumber(2) + set val(Term v) { + setField(2, v); + } + + @$pb.TagNumber(2) + $core.bool hasVal() => $_has(1); + @$pb.TagNumber(2) + void clearVal() => clearField(2); + @$pb.TagNumber(2) + Term ensureVal() => $_ensure(1); +} + +class Query extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Query', + createEmptyInstance: create) + ..e( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'type', + $pb.PbFieldType.OE, + defaultOrMaker: Query_QueryType.START, + valueOf: Query_QueryType.valueOf, + enumValues: Query_QueryType.values) + ..aOM( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'query', + subBuilder: Term.create) + ..aInt64( + 3, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'token') + ..aOB( + 4, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'OBSOLETENoreply', + protoName: 'OBSOLETE_noreply') + ..aOB( + 5, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'acceptsRJson') + ..pc( + 6, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'globalOptargs', + $pb.PbFieldType.PM, + subBuilder: Query_AssocPair.create) + ..hasRequiredFields = false; + + Query._() : super(); + factory Query({ + Query_QueryType? type, + Term? query, + $fixnum.Int64? token, + $core.bool? oBSOLETENoreply, + $core.bool? acceptsRJson, + $core.Iterable? 
globalOptargs, + }) { + final result = create(); + if (type != null) { + result.type = type; + } + if (query != null) { + result.query = query; + } + if (token != null) { + result.token = token; + } + if (oBSOLETENoreply != null) { + result.oBSOLETENoreply = oBSOLETENoreply; + } + if (acceptsRJson != null) { + result.acceptsRJson = acceptsRJson; + } + if (globalOptargs != null) { + result.globalOptargs.addAll(globalOptargs); + } + return result; + } + factory Query.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Query.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Query clone() => Query()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Query copyWith(void Function(Query) updates) => + super.copyWith((message) => updates(message as Query)) + as Query; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Query create() => Query._(); + Query createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Query getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Query? _defaultInstance; + + @$pb.TagNumber(1) + Query_QueryType get type => $_getN(0); + @$pb.TagNumber(1) + set type(Query_QueryType v) { + setField(1, v); + } + + @$pb.TagNumber(1) + $core.bool hasType() => $_has(0); + @$pb.TagNumber(1) + void clearType() => clearField(1); + + @$pb.TagNumber(2) + Term get query => $_getN(1); + @$pb.TagNumber(2) + set query(Term v) { + setField(2, v); + } + + @$pb.TagNumber(2) + $core.bool hasQuery() => $_has(1); + @$pb.TagNumber(2) + void clearQuery() => clearField(2); + @$pb.TagNumber(2) + Term ensureQuery() => $_ensure(1); + + @$pb.TagNumber(3) + $fixnum.Int64 get token => $_getI64(2); + @$pb.TagNumber(3) + set token($fixnum.Int64 v) { + $_setInt64(2, v); + } + + @$pb.TagNumber(3) + $core.bool hasToken() => $_has(2); + @$pb.TagNumber(3) + void clearToken() => clearField(3); + + @$pb.TagNumber(4) + $core.bool get oBSOLETENoreply => $_getBF(3); + @$pb.TagNumber(4) + set oBSOLETENoreply($core.bool v) { + $_setBool(3, v); + } + + @$pb.TagNumber(4) + $core.bool hasOBSOLETENoreply() => $_has(3); + @$pb.TagNumber(4) + void clearOBSOLETENoreply() => clearField(4); + + @$pb.TagNumber(5) + $core.bool get acceptsRJson => $_getBF(4); + @$pb.TagNumber(5) + set acceptsRJson($core.bool v) { + $_setBool(4, v); + } + + @$pb.TagNumber(5) + $core.bool hasAcceptsRJson() => $_has(4); + @$pb.TagNumber(5) + void clearAcceptsRJson() => clearField(5); + + @$pb.TagNumber(6) + $core.List get globalOptargs => $_getList(5); +} + +class Frame extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Frame', + createEmptyInstance: create) + ..e( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? 
'' + : 'type', + $pb.PbFieldType.OE, + defaultOrMaker: Frame_FrameType.POS, + valueOf: Frame_FrameType.valueOf, + enumValues: Frame_FrameType.values) + ..aInt64( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'pos') + ..aOS( + 3, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'opt') + ..hasRequiredFields = false; + + Frame._() : super(); + factory Frame({ + Frame_FrameType? type, + $fixnum.Int64? pos, + $core.String? opt, + }) { + final result = create(); + if (type != null) { + result.type = type; + } + if (pos != null) { + result.pos = pos; + } + if (opt != null) { + result.opt = opt; + } + return result; + } + factory Frame.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Frame.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Frame clone() => Frame()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Frame copyWith(void Function(Frame) updates) => + super.copyWith((message) => updates(message as Frame)) + as Frame; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Frame create() => Frame._(); + Frame createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Frame getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Frame? _defaultInstance; + + @$pb.TagNumber(1) + Frame_FrameType get type => $_getN(0); + @$pb.TagNumber(1) + set type(Frame_FrameType v) { + setField(1, v); + } + + @$pb.TagNumber(1) + $core.bool hasType() => $_has(0); + @$pb.TagNumber(1) + void clearType() => clearField(1); + + @$pb.TagNumber(2) + $fixnum.Int64 get pos => $_getI64(1); + @$pb.TagNumber(2) + set pos($fixnum.Int64 v) { + $_setInt64(1, v); + } + + @$pb.TagNumber(2) + $core.bool hasPos() => $_has(1); + @$pb.TagNumber(2) + void clearPos() => clearField(2); + + @$pb.TagNumber(3) + $core.String get opt => $_getSZ(2); + @$pb.TagNumber(3) + set opt($core.String v) { + $_setString(2, v); + } + + @$pb.TagNumber(3) + $core.bool hasOpt() => $_has(2); + @$pb.TagNumber(3) + void clearOpt() => clearField(3); +} + +class Backtrace extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Backtrace', + createEmptyInstance: create) + ..pc( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'frames', + $pb.PbFieldType.PM, + subBuilder: Frame.create) + ..hasRequiredFields = false; + + Backtrace._() : super(); + factory Backtrace({ + $core.Iterable? 
frames, + }) { + final result = create(); + if (frames != null) { + result.frames.addAll(frames); + } + return result; + } + factory Backtrace.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Backtrace.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Backtrace clone() => Backtrace()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Backtrace copyWith(void Function(Backtrace) updates) => + super.copyWith((message) => updates(message as Backtrace)) + as Backtrace; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Backtrace create() => Backtrace._(); + Backtrace createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Backtrace getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Backtrace? _defaultInstance; + + @$pb.TagNumber(1) + $core.List get frames => $_getList(0); +} + +class Response extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Response', + createEmptyInstance: create) + ..e( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'type', + $pb.PbFieldType.OE, + defaultOrMaker: Response_ResponseType.SUCCESS_ATOM, + valueOf: Response_ResponseType.valueOf, + enumValues: Response_ResponseType.values) + ..aInt64( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'token') + ..pc( + 3, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'response', + $pb.PbFieldType.PM, + subBuilder: Datum.create) + ..aOM( + 4, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'backtrace', + subBuilder: Backtrace.create) + ..aOM( + 5, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'profile', + subBuilder: Datum.create) + ..pc( + 6, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'notes', + $pb.PbFieldType.PE, + valueOf: Response_ResponseNote.valueOf, + enumValues: Response_ResponseNote.values) + ..e( + 7, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'errorType', + $pb.PbFieldType.OE, + defaultOrMaker: Response_ErrorType.INTERNAL, + valueOf: Response_ErrorType.valueOf, + enumValues: Response_ErrorType.values) + ..hasRequiredFields = false; + + Response._() : super(); + factory Response({ + Response_ResponseType? type, + $fixnum.Int64? token, + $core.Iterable? response, + Backtrace? backtrace, + Datum? profile, + $core.Iterable? notes, + Response_ErrorType? 
errorType, + }) { + final result = create(); + if (type != null) { + result.type = type; + } + if (token != null) { + result.token = token; + } + if (response != null) { + result.response.addAll(response); + } + if (backtrace != null) { + result.backtrace = backtrace; + } + if (profile != null) { + result.profile = profile; + } + if (notes != null) { + result.notes.addAll(notes); + } + if (errorType != null) { + result.errorType = errorType; + } + return result; + } + factory Response.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Response.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Response clone() => Response()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Response copyWith(void Function(Response) updates) => + super.copyWith((message) => updates(message as Response)) + as Response; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Response create() => Response._(); + Response createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Response getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Response? _defaultInstance; + + @$pb.TagNumber(1) + Response_ResponseType get type => $_getN(0); + @$pb.TagNumber(1) + set type(Response_ResponseType v) { + setField(1, v); + } + + @$pb.TagNumber(1) + $core.bool hasType() => $_has(0); + @$pb.TagNumber(1) + void clearType() => clearField(1); + + @$pb.TagNumber(2) + $fixnum.Int64 get token => $_getI64(1); + @$pb.TagNumber(2) + set token($fixnum.Int64 v) { + $_setInt64(1, v); + } + + @$pb.TagNumber(2) + $core.bool hasToken() => $_has(1); + @$pb.TagNumber(2) + void clearToken() => clearField(2); + + @$pb.TagNumber(3) + $core.List get response => $_getList(2); + + @$pb.TagNumber(4) + Backtrace get backtrace => $_getN(3); + @$pb.TagNumber(4) + set backtrace(Backtrace v) { + setField(4, v); + } + + @$pb.TagNumber(4) + $core.bool hasBacktrace() => $_has(3); + @$pb.TagNumber(4) + void clearBacktrace() => clearField(4); + @$pb.TagNumber(4) + Backtrace ensureBacktrace() => $_ensure(3); + + @$pb.TagNumber(5) + Datum get profile => $_getN(4); + @$pb.TagNumber(5) + set profile(Datum v) { + setField(5, v); + } + + @$pb.TagNumber(5) + $core.bool hasProfile() => $_has(4); + @$pb.TagNumber(5) + void clearProfile() => clearField(5); + @$pb.TagNumber(5) + Datum ensureProfile() => $_ensure(4); + + @$pb.TagNumber(6) + $core.List get notes => $_getList(5); + + @$pb.TagNumber(7) + Response_ErrorType get errorType => $_getN(6); + @$pb.TagNumber(7) + set errorType(Response_ErrorType v) { + setField(7, v); + } + + @$pb.TagNumber(7) + $core.bool hasErrorType() => $_has(6); + @$pb.TagNumber(7) + void clearErrorType() => clearField(7); +} + +class Datum_AssocPair extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? 
'' + : 'Datum.AssocPair', + createEmptyInstance: create) + ..aOS( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'key') + ..aOM( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'val', + subBuilder: Datum.create) + ..hasRequiredFields = false; + + Datum_AssocPair._() : super(); + factory Datum_AssocPair({ + $core.String? key, + Datum? val, + }) { + final result = create(); + if (key != null) { + result.key = key; + } + if (val != null) { + result.val = val; + } + return result; + } + factory Datum_AssocPair.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Datum_AssocPair.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Datum_AssocPair clone() => Datum_AssocPair()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Datum_AssocPair copyWith(void Function(Datum_AssocPair) updates) => + super.copyWith((message) => updates(message as Datum_AssocPair)) + as Datum_AssocPair; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Datum_AssocPair create() => Datum_AssocPair._(); + Datum_AssocPair createEmptyInstance() => create(); + static $pb.PbList createRepeated() => + $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Datum_AssocPair getDefault() => _defaultInstance ??= + $pb.GeneratedMessage.$_defaultFor(create); + static Datum_AssocPair? _defaultInstance; + + @$pb.TagNumber(1) + $core.String get key => $_getSZ(0); + @$pb.TagNumber(1) + set key($core.String v) { + $_setString(0, v); + } + + @$pb.TagNumber(1) + $core.bool hasKey() => $_has(0); + @$pb.TagNumber(1) + void clearKey() => clearField(1); + + @$pb.TagNumber(2) + Datum get val => $_getN(1); + @$pb.TagNumber(2) + set val(Datum v) { + setField(2, v); + } + + @$pb.TagNumber(2) + $core.bool hasVal() => $_has(1); + @$pb.TagNumber(2) + void clearVal() => clearField(2); + @$pb.TagNumber(2) + Datum ensureVal() => $_ensure(1); +} + +class Datum extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Datum', + createEmptyInstance: create) + ..e( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'type', + $pb.PbFieldType.OE, + defaultOrMaker: Datum_DatumType.R_NULL, + valueOf: Datum_DatumType.valueOf, + enumValues: Datum_DatumType.values) + ..aOB( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'rBool') + ..a<$core.double>( + 3, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'rNum', + $pb.PbFieldType.OD) + ..aOS( + 4, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'rStr') + ..pc( + 5, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'rArray', + $pb.PbFieldType.PM, + subBuilder: Datum.create) + ..pc( + 6, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? 
'' + : 'rObject', + $pb.PbFieldType.PM, + subBuilder: Datum_AssocPair.create) + ..hasRequiredFields = false; + + Datum._() : super(); + factory Datum({ + Datum_DatumType? type, + $core.bool? rBool, + $core.double? rNum, + $core.String? rStr, + $core.Iterable? rArray, + $core.Iterable? rObject, + }) { + final result = create(); + if (type != null) { + result.type = type; + } + if (rBool != null) { + result.rBool = rBool; + } + if (rNum != null) { + result.rNum = rNum; + } + if (rStr != null) { + result.rStr = rStr; + } + if (rArray != null) { + result.rArray.addAll(rArray); + } + if (rObject != null) { + result.rObject.addAll(rObject); + } + return result; + } + factory Datum.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Datum.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Datum clone() => Datum()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Datum copyWith(void Function(Datum) updates) => + super.copyWith((message) => updates(message as Datum)) + as Datum; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Datum create() => Datum._(); + Datum createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Datum getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Datum? _defaultInstance; + + @$pb.TagNumber(1) + Datum_DatumType get type => $_getN(0); + @$pb.TagNumber(1) + set type(Datum_DatumType v) { + setField(1, v); + } + + @$pb.TagNumber(1) + $core.bool hasType() => $_has(0); + @$pb.TagNumber(1) + void clearType() => clearField(1); + + @$pb.TagNumber(2) + $core.bool get rBool => $_getBF(1); + @$pb.TagNumber(2) + set rBool($core.bool v) { + $_setBool(1, v); + } + + @$pb.TagNumber(2) + $core.bool hasRBool() => $_has(1); + @$pb.TagNumber(2) + void clearRBool() => clearField(2); + + @$pb.TagNumber(3) + $core.double get rNum => $_getN(2); + @$pb.TagNumber(3) + set rNum($core.double v) { + $_setDouble(2, v); + } + + @$pb.TagNumber(3) + $core.bool hasRNum() => $_has(2); + @$pb.TagNumber(3) + void clearRNum() => clearField(3); + + @$pb.TagNumber(4) + $core.String get rStr => $_getSZ(3); + @$pb.TagNumber(4) + set rStr($core.String v) { + $_setString(3, v); + } + + @$pb.TagNumber(4) + $core.bool hasRStr() => $_has(3); + @$pb.TagNumber(4) + void clearRStr() => clearField(4); + + @$pb.TagNumber(5) + $core.List get rArray => $_getList(4); + + @$pb.TagNumber(6) + $core.List get rObject => $_getList(5); +} + +class Term_AssocPair extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Term.AssocPair', + createEmptyInstance: create) + ..aOS( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'key') + ..aOM( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? 
'' + : 'val', + subBuilder: Term.create) + ..hasRequiredFields = false; + + Term_AssocPair._() : super(); + factory Term_AssocPair({ + $core.String? key, + Term? val, + }) { + final result = create(); + if (key != null) { + result.key = key; + } + if (val != null) { + result.val = val; + } + return result; + } + factory Term_AssocPair.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Term_AssocPair.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Term_AssocPair clone() => Term_AssocPair()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Term_AssocPair copyWith(void Function(Term_AssocPair) updates) => + super.copyWith((message) => updates(message as Term_AssocPair)) + as Term_AssocPair; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Term_AssocPair create() => Term_AssocPair._(); + Term_AssocPair createEmptyInstance() => create(); + static $pb.PbList createRepeated() => + $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Term_AssocPair getDefault() => _defaultInstance ??= + $pb.GeneratedMessage.$_defaultFor(create); + static Term_AssocPair? _defaultInstance; + + @$pb.TagNumber(1) + $core.String get key => $_getSZ(0); + @$pb.TagNumber(1) + set key($core.String v) { + $_setString(0, v); + } + + @$pb.TagNumber(1) + $core.bool hasKey() => $_has(0); + @$pb.TagNumber(1) + void clearKey() => clearField(1); + + @$pb.TagNumber(2) + Term get val => $_getN(1); + @$pb.TagNumber(2) + set val(Term v) { + setField(2, v); + } + + @$pb.TagNumber(2) + $core.bool hasVal() => $_has(1); + @$pb.TagNumber(2) + void clearVal() => clearField(2); + @$pb.TagNumber(2) + Term ensureVal() => $_ensure(1); +} + +class Term extends $pb.GeneratedMessage { + static final $pb.BuilderInfo _i = $pb.BuilderInfo( + const $core.bool.fromEnvironment('protobuf.omit_message_names') + ? '' + : 'Term', + createEmptyInstance: create) + ..e( + 1, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'type', + $pb.PbFieldType.OE, + defaultOrMaker: Term_TermType.DATUM, + valueOf: Term_TermType.valueOf, + enumValues: Term_TermType.values) + ..aOM( + 2, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'datum', + subBuilder: Datum.create) + ..pc( + 3, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'args', + $pb.PbFieldType.PM, + subBuilder: Term.create) + ..pc( + 4, + const $core.bool.fromEnvironment('protobuf.omit_field_names') + ? '' + : 'optargs', + $pb.PbFieldType.PM, + subBuilder: Term_AssocPair.create) + ..hasRequiredFields = false; + + Term._() : super(); + factory Term({ + Term_TermType? type, + Datum? datum, + $core.Iterable? args, + $core.Iterable? 
optargs, + }) { + final result = create(); + if (type != null) { + result.type = type; + } + if (datum != null) { + result.datum = datum; + } + if (args != null) { + result.args.addAll(args); + } + if (optargs != null) { + result.optargs.addAll(optargs); + } + return result; + } + factory Term.fromBuffer($core.List<$core.int> i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromBuffer(i, r); + factory Term.fromJson($core.String i, + [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => + create()..mergeFromJson(i, r); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.deepCopy] instead. ' + 'Will be removed in next major version') + Term clone() => Term()..mergeFromMessage(this); + @$core.Deprecated('Using this can add significant overhead to your binary. ' + 'Use [GeneratedMessageGenericExtensions.rebuild] instead. ' + 'Will be removed in next major version') + Term copyWith(void Function(Term) updates) => + super.copyWith((message) => updates(message as Term)) + as Term; // ignore: deprecated_member_use + $pb.BuilderInfo get info_ => _i; + @$core.pragma('dart2js:noInline') + static Term create() => Term._(); + Term createEmptyInstance() => create(); + static $pb.PbList createRepeated() => $pb.PbList(); + @$core.pragma('dart2js:noInline') + static Term getDefault() => + _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor(create); + static Term? _defaultInstance; + + @$pb.TagNumber(1) + Term_TermType get type => $_getN(0); + @$pb.TagNumber(1) + set type(Term_TermType v) { + setField(1, v); + } + + @$pb.TagNumber(1) + $core.bool hasType() => $_has(0); + @$pb.TagNumber(1) + void clearType() => clearField(1); + + @$pb.TagNumber(2) + Datum get datum => $_getN(1); + @$pb.TagNumber(2) + set datum(Datum v) { + setField(2, v); + } + + @$pb.TagNumber(2) + $core.bool hasDatum() => $_has(1); + @$pb.TagNumber(2) + void clearDatum() => clearField(2); + @$pb.TagNumber(2) + Datum ensureDatum() => $_ensure(1); + + @$pb.TagNumber(3) + $core.List get args => $_getList(2); + + @$pb.TagNumber(4) + $core.List get optargs => $_getList(3); +} diff --git a/drivers/rethinkdb/lib/src/generated/ql2.pbenum.dart b/drivers/rethinkdb/lib/src/generated/ql2.pbenum.dart new file mode 100644 index 0000000..9705f5a --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/ql2.pbenum.dart @@ -0,0 +1,1019 @@ +// Generated code. Do not modify. +// source: ql2.proto +// +// @dart = 2.12 +// ignore_for_file: annotate_overrides,camel_case_types,unnecessary_const,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type,unnecessary_this,prefer_final_fields + +// ignore_for_file: UNDEFINED_SHOWN_NAME +import 'dart:core' as $core; +import 'package:protobuf/protobuf.dart' as $pb; + +class VersionDummy_Version extends $pb.ProtobufEnum { + static const VersionDummy_Version V0_1 = VersionDummy_Version._(1063369270, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'V0_1'); + static const VersionDummy_Version V0_2 = VersionDummy_Version._(1915781601, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'V0_2'); + static const VersionDummy_Version V0_3 = VersionDummy_Version._(1601562686, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'V0_3'); + static const VersionDummy_Version V0_4 = VersionDummy_Version._(1074539808, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'V0_4'); + static const VersionDummy_Version V1_0 = VersionDummy_Version._(885177795, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'V1_0'); + + static const $core.List values = [ + V0_1, + V0_2, + V0_3, + V0_4, + V1_0, + ]; + + static final $core.Map<$core.int, VersionDummy_Version> _byValue = + $pb.ProtobufEnum.initByValue(values); + static VersionDummy_Version? valueOf($core.int value) => _byValue[value]; + + const VersionDummy_Version._($core.int v, $core.String n) : super(v, n); +} + +class VersionDummy_Protocol extends $pb.ProtobufEnum { + static const VersionDummy_Protocol PROTOBUF = VersionDummy_Protocol._( + 656407617, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'PROTOBUF'); + static const VersionDummy_Protocol JSON = VersionDummy_Protocol._(2120839367, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'JSON'); + + static const $core.List values = + [ + PROTOBUF, + JSON, + ]; + + static final $core.Map<$core.int, VersionDummy_Protocol> _byValue = + $pb.ProtobufEnum.initByValue(values); + static VersionDummy_Protocol? valueOf($core.int value) => _byValue[value]; + + const VersionDummy_Protocol._($core.int v, $core.String n) : super(v, n); +} + +class Query_QueryType extends $pb.ProtobufEnum { + static const Query_QueryType START = Query_QueryType._( + 1, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'START'); + static const Query_QueryType CONTINUE = Query_QueryType._(2, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CONTINUE'); + static const Query_QueryType STOP = Query_QueryType._( + 3, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'STOP'); + static const Query_QueryType NOREPLY_WAIT = Query_QueryType._( + 4, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'NOREPLY_WAIT'); + static const Query_QueryType SERVER_INFO = Query_QueryType._( + 5, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SERVER_INFO'); + + static const $core.List values = [ + START, + CONTINUE, + STOP, + NOREPLY_WAIT, + SERVER_INFO, + ]; + + static final $core.Map<$core.int, Query_QueryType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Query_QueryType? valueOf($core.int value) => _byValue[value]; + + const Query_QueryType._($core.int v, $core.String n) : super(v, n); +} + +class Frame_FrameType extends $pb.ProtobufEnum { + static const Frame_FrameType POS = Frame_FrameType._( + 1, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'POS'); + static const Frame_FrameType OPT = Frame_FrameType._( + 2, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'OPT'); + + static const $core.List values = [ + POS, + OPT, + ]; + + static final $core.Map<$core.int, Frame_FrameType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Frame_FrameType? valueOf($core.int value) => _byValue[value]; + + const Frame_FrameType._($core.int v, $core.String n) : super(v, n); +} + +class Response_ResponseType extends $pb.ProtobufEnum { + static const Response_ResponseType SUCCESS_ATOM = Response_ResponseType._( + 1, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SUCCESS_ATOM'); + static const Response_ResponseType SUCCESS_SEQUENCE = Response_ResponseType._( + 2, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SUCCESS_SEQUENCE'); + static const Response_ResponseType SUCCESS_PARTIAL = Response_ResponseType._( + 3, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? 
'' + : 'SUCCESS_PARTIAL'); + static const Response_ResponseType WAIT_COMPLETE = Response_ResponseType._( + 4, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'WAIT_COMPLETE'); + static const Response_ResponseType SERVER_INFO = Response_ResponseType._( + 5, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SERVER_INFO'); + static const Response_ResponseType CLIENT_ERROR = Response_ResponseType._( + 16, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'CLIENT_ERROR'); + static const Response_ResponseType COMPILE_ERROR = Response_ResponseType._( + 17, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'COMPILE_ERROR'); + static const Response_ResponseType RUNTIME_ERROR = Response_ResponseType._( + 18, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'RUNTIME_ERROR'); + + static const $core.List values = + [ + SUCCESS_ATOM, + SUCCESS_SEQUENCE, + SUCCESS_PARTIAL, + WAIT_COMPLETE, + SERVER_INFO, + CLIENT_ERROR, + COMPILE_ERROR, + RUNTIME_ERROR, + ]; + + static final $core.Map<$core.int, Response_ResponseType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Response_ResponseType? valueOf($core.int value) => _byValue[value]; + + const Response_ResponseType._($core.int v, $core.String n) : super(v, n); +} + +class Response_ErrorType extends $pb.ProtobufEnum { + static const Response_ErrorType INTERNAL = Response_ErrorType._(1000000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'INTERNAL'); + static const Response_ErrorType RESOURCE_LIMIT = Response_ErrorType._( + 2000000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'RESOURCE_LIMIT'); + static const Response_ErrorType QUERY_LOGIC = Response_ErrorType._( + 3000000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'QUERY_LOGIC'); + static const Response_ErrorType NON_EXISTENCE = Response_ErrorType._( + 3100000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'NON_EXISTENCE'); + static const Response_ErrorType OP_FAILED = Response_ErrorType._( + 4100000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'OP_FAILED'); + static const Response_ErrorType OP_INDETERMINATE = Response_ErrorType._( + 4200000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'OP_INDETERMINATE'); + static const Response_ErrorType USER = Response_ErrorType._(5000000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'USER'); + static const Response_ErrorType PERMISSION_ERROR = Response_ErrorType._( + 6000000, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'PERMISSION_ERROR'); + + static const $core.List values = [ + INTERNAL, + RESOURCE_LIMIT, + QUERY_LOGIC, + NON_EXISTENCE, + OP_FAILED, + OP_INDETERMINATE, + USER, + PERMISSION_ERROR, + ]; + + static final $core.Map<$core.int, Response_ErrorType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Response_ErrorType? valueOf($core.int value) => _byValue[value]; + + const Response_ErrorType._($core.int v, $core.String n) : super(v, n); +} + +class Response_ResponseNote extends $pb.ProtobufEnum { + static const Response_ResponseNote SEQUENCE_FEED = Response_ResponseNote._( + 1, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SEQUENCE_FEED'); + static const Response_ResponseNote ATOM_FEED = Response_ResponseNote._( + 2, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? 
'' + : 'ATOM_FEED'); + static const Response_ResponseNote ORDER_BY_LIMIT_FEED = + Response_ResponseNote._( + 3, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'ORDER_BY_LIMIT_FEED'); + static const Response_ResponseNote UNIONED_FEED = Response_ResponseNote._( + 4, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'UNIONED_FEED'); + static const Response_ResponseNote INCLUDES_STATES = Response_ResponseNote._( + 5, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INCLUDES_STATES'); + + static const $core.List values = + [ + SEQUENCE_FEED, + ATOM_FEED, + ORDER_BY_LIMIT_FEED, + UNIONED_FEED, + INCLUDES_STATES, + ]; + + static final $core.Map<$core.int, Response_ResponseNote> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Response_ResponseNote? valueOf($core.int value) => _byValue[value]; + + const Response_ResponseNote._($core.int v, $core.String n) : super(v, n); +} + +class Datum_DatumType extends $pb.ProtobufEnum { + static const Datum_DatumType R_NULL = Datum_DatumType._(1, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_NULL'); + static const Datum_DatumType R_BOOL = Datum_DatumType._(2, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_BOOL'); + static const Datum_DatumType R_NUM = Datum_DatumType._( + 3, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_NUM'); + static const Datum_DatumType R_STR = Datum_DatumType._( + 4, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_STR'); + static const Datum_DatumType R_ARRAY = Datum_DatumType._(5, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_ARRAY'); + static const Datum_DatumType R_OBJECT = Datum_DatumType._(6, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_OBJECT'); + static const Datum_DatumType R_JSON = Datum_DatumType._(7, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'R_JSON'); + + static const $core.List values = [ + R_NULL, + R_BOOL, + R_NUM, + R_STR, + R_ARRAY, + R_OBJECT, + R_JSON, + ]; + + static final $core.Map<$core.int, Datum_DatumType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Datum_DatumType? valueOf($core.int value) => _byValue[value]; + + const Datum_DatumType._($core.int v, $core.String n) : super(v, n); +} + +class Term_TermType extends $pb.ProtobufEnum { + static const Term_TermType DATUM = Term_TermType._( + 1, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DATUM'); + static const Term_TermType MAKE_ARRAY = Term_TermType._( + 2, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'MAKE_ARRAY'); + static const Term_TermType MAKE_OBJ = Term_TermType._(3, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MAKE_OBJ'); + static const Term_TermType VAR = Term_TermType._( + 10, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'VAR'); + static const Term_TermType JAVASCRIPT = Term_TermType._( + 11, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'JAVASCRIPT'); + static const Term_TermType UUID = Term_TermType._(169, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'UUID'); + static const Term_TermType HTTP = Term_TermType._(153, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'HTTP'); + static const Term_TermType ERROR = Term_TermType._(12, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'ERROR'); + static const Term_TermType IMPLICIT_VAR = Term_TermType._( + 13, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'IMPLICIT_VAR'); + static const Term_TermType DB = Term_TermType._( + 14, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DB'); + static const Term_TermType TABLE = Term_TermType._(15, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'TABLE'); + static const Term_TermType GET = Term_TermType._( + 16, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GET'); + static const Term_TermType GET_ALL = Term_TermType._(78, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GET_ALL'); + static const Term_TermType EQ = Term_TermType._( + 17, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'EQ'); + static const Term_TermType NE = Term_TermType._( + 18, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'NE'); + static const Term_TermType LT = Term_TermType._( + 19, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'LT'); + static const Term_TermType LE = Term_TermType._( + 20, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'LE'); + static const Term_TermType GT = Term_TermType._( + 21, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GT'); + static const Term_TermType GE = Term_TermType._( + 22, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GE'); + static const Term_TermType NOT = Term_TermType._( + 23, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'NOT'); + static const Term_TermType ADD = Term_TermType._( + 24, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ADD'); + static const Term_TermType SUB = Term_TermType._( + 25, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SUB'); + static const Term_TermType MUL = Term_TermType._( + 26, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MUL'); + static const Term_TermType DIV = Term_TermType._( + 27, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DIV'); + static const Term_TermType MOD = Term_TermType._( + 28, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MOD'); + static const Term_TermType FLOOR = Term_TermType._(183, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FLOOR'); + static const Term_TermType CEIL = Term_TermType._(184, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CEIL'); + static const Term_TermType ROUND = Term_TermType._(185, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ROUND'); + static const Term_TermType APPEND = Term_TermType._(29, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'APPEND'); + static const Term_TermType PREPEND = Term_TermType._(80, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'PREPEND'); + static const Term_TermType DIFFERENCE = Term_TermType._( + 95, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'DIFFERENCE'); + static const Term_TermType SET_INSERT = Term_TermType._( + 88, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SET_INSERT'); + static const Term_TermType SET_INTERSECTION = Term_TermType._( + 89, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SET_INTERSECTION'); + static const Term_TermType SET_UNION = Term_TermType._( + 90, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? 
'' + : 'SET_UNION'); + static const Term_TermType SET_DIFFERENCE = Term_TermType._( + 91, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SET_DIFFERENCE'); + static const Term_TermType SLICE = Term_TermType._(30, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SLICE'); + static const Term_TermType SKIP = Term_TermType._( + 70, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SKIP'); + static const Term_TermType LIMIT = Term_TermType._(71, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'LIMIT'); + static const Term_TermType OFFSETS_OF = Term_TermType._( + 87, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'OFFSETS_OF'); + static const Term_TermType CONTAINS = Term_TermType._(93, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CONTAINS'); + static const Term_TermType GET_FIELD = Term_TermType._( + 31, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'GET_FIELD'); + static const Term_TermType KEYS = Term_TermType._( + 94, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'KEYS'); + static const Term_TermType VALUES = Term_TermType._(186, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'VALUES'); + static const Term_TermType OBJECT = Term_TermType._(143, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'OBJECT'); + static const Term_TermType HAS_FIELDS = Term_TermType._( + 32, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'HAS_FIELDS'); + static const Term_TermType WITH_FIELDS = Term_TermType._( + 96, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'WITH_FIELDS'); + static const Term_TermType PLUCK = Term_TermType._(33, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'PLUCK'); + static const Term_TermType WITHOUT = Term_TermType._(34, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'WITHOUT'); + static const Term_TermType MERGE = Term_TermType._(35, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MERGE'); + static const Term_TermType BETWEEN_DEPRECATED = Term_TermType._( + 36, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'BETWEEN_DEPRECATED'); + static const Term_TermType BETWEEN = Term_TermType._(182, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BETWEEN'); + static const Term_TermType REDUCE = Term_TermType._(37, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'REDUCE'); + static const Term_TermType MAP = Term_TermType._( + 38, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MAP'); + static const Term_TermType FOLD = Term_TermType._(187, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FOLD'); + static const Term_TermType FILTER = Term_TermType._(39, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FILTER'); + static const Term_TermType CONCAT_MAP = Term_TermType._( + 40, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'CONCAT_MAP'); + static const Term_TermType ORDER_BY = Term_TermType._(41, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ORDER_BY'); + static const Term_TermType DISTINCT = Term_TermType._(42, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DISTINCT'); + static const Term_TermType COUNT = Term_TermType._(43, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'COUNT'); + static const Term_TermType IS_EMPTY = Term_TermType._(86, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'IS_EMPTY'); + static const Term_TermType UNION = Term_TermType._(44, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'UNION'); + static const Term_TermType NTH = Term_TermType._( + 45, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'NTH'); + static const Term_TermType BRACKET = Term_TermType._(170, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BRACKET'); + static const Term_TermType INNER_JOIN = Term_TermType._( + 48, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INNER_JOIN'); + static const Term_TermType OUTER_JOIN = Term_TermType._( + 49, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'OUTER_JOIN'); + static const Term_TermType EQ_JOIN = Term_TermType._(50, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'EQ_JOIN'); + static const Term_TermType ZIP = Term_TermType._( + 72, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ZIP'); + static const Term_TermType RANGE = Term_TermType._(173, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'RANGE'); + static const Term_TermType INSERT_AT = Term_TermType._( + 82, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INSERT_AT'); + static const Term_TermType DELETE_AT = Term_TermType._( + 83, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'DELETE_AT'); + static const Term_TermType CHANGE_AT = Term_TermType._( + 84, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'CHANGE_AT'); + static const Term_TermType SPLICE_AT = Term_TermType._( + 85, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SPLICE_AT'); + static const Term_TermType COERCE_TO = Term_TermType._( + 51, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'COERCE_TO'); + static const Term_TermType TYPE_OF = Term_TermType._(52, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'TYPE_OF'); + static const Term_TermType UPDATE = Term_TermType._(53, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'UPDATE'); + static const Term_TermType DELETE = Term_TermType._(54, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DELETE'); + static const Term_TermType REPLACE = Term_TermType._(55, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'REPLACE'); + static const Term_TermType INSERT = Term_TermType._(56, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'INSERT'); + static const Term_TermType DB_CREATE = Term_TermType._( + 57, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'DB_CREATE'); + static const Term_TermType DB_DROP = Term_TermType._(58, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DB_DROP'); + static const Term_TermType DB_LIST = Term_TermType._(59, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DB_LIST'); + static const Term_TermType TABLE_CREATE = Term_TermType._( + 60, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TABLE_CREATE'); + static const Term_TermType TABLE_DROP = Term_TermType._( + 61, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TABLE_DROP'); + static const Term_TermType TABLE_LIST = Term_TermType._( + 62, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? 
'' + : 'TABLE_LIST'); + static const Term_TermType CONFIG = Term_TermType._(174, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CONFIG'); + static const Term_TermType STATUS = Term_TermType._(175, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'STATUS'); + static const Term_TermType WAIT = Term_TermType._(177, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'WAIT'); + static const Term_TermType RECONFIGURE = Term_TermType._( + 176, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'RECONFIGURE'); + static const Term_TermType REBALANCE = Term_TermType._( + 179, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'REBALANCE'); + static const Term_TermType SYNC = Term_TermType._(138, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SYNC'); + static const Term_TermType GRANT = Term_TermType._(188, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GRANT'); + static const Term_TermType INDEX_CREATE = Term_TermType._( + 75, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_CREATE'); + static const Term_TermType INDEX_DROP = Term_TermType._( + 76, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_DROP'); + static const Term_TermType INDEX_LIST = Term_TermType._( + 77, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_LIST'); + static const Term_TermType INDEX_STATUS = Term_TermType._( + 139, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_STATUS'); + static const Term_TermType INDEX_WAIT = Term_TermType._( + 140, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_WAIT'); + static const Term_TermType INDEX_RENAME = Term_TermType._( + 156, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INDEX_RENAME'); + static const Term_TermType SET_WRITE_HOOK = Term_TermType._( + 189, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SET_WRITE_HOOK'); + static const Term_TermType GET_WRITE_HOOK = Term_TermType._( + 190, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'GET_WRITE_HOOK'); + static const Term_TermType FUNCALL = Term_TermType._(64, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FUNCALL'); + static const Term_TermType BRANCH = Term_TermType._(65, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BRANCH'); + static const Term_TermType OR = Term_TermType._( + 66, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'OR'); + static const Term_TermType AND = Term_TermType._( + 67, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'AND'); + static const Term_TermType FOR_EACH = Term_TermType._(68, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FOR_EACH'); + static const Term_TermType FUNC = Term_TermType._( + 69, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FUNC'); + static const Term_TermType ASC = Term_TermType._( + 73, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ASC'); + static const Term_TermType DESC = Term_TermType._( + 74, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DESC'); + static const Term_TermType INFO = Term_TermType._( + 79, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'INFO'); + static const Term_TermType MATCH = Term_TermType._(97, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'MATCH'); + static const Term_TermType UPCASE = Term_TermType._(141, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'UPCASE'); + static const Term_TermType DOWNCASE = Term_TermType._(142, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DOWNCASE'); + static const Term_TermType SAMPLE = Term_TermType._(81, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SAMPLE'); + static const Term_TermType DEFAULT = Term_TermType._(92, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DEFAULT'); + static const Term_TermType JSON = Term_TermType._( + 98, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'JSON'); + static const Term_TermType ISO8601 = Term_TermType._(99, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ISO8601'); + static const Term_TermType TO_ISO8601 = Term_TermType._( + 100, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TO_ISO8601'); + static const Term_TermType EPOCH_TIME = Term_TermType._( + 101, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'EPOCH_TIME'); + static const Term_TermType TO_EPOCH_TIME = Term_TermType._( + 102, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TO_EPOCH_TIME'); + static const Term_TermType NOW = Term_TermType._( + 103, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'NOW'); + static const Term_TermType IN_TIMEZONE = Term_TermType._( + 104, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'IN_TIMEZONE'); + static const Term_TermType DURING = Term_TermType._(105, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DURING'); + static const Term_TermType DATE = Term_TermType._(106, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DATE'); + static const Term_TermType TIME_OF_DAY = Term_TermType._( + 126, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TIME_OF_DAY'); + static const Term_TermType TIMEZONE = Term_TermType._(127, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'TIMEZONE'); + static const Term_TermType YEAR = Term_TermType._(128, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'YEAR'); + static const Term_TermType MONTH = Term_TermType._(129, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MONTH'); + static const Term_TermType DAY = Term_TermType._( + 130, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DAY'); + static const Term_TermType DAY_OF_WEEK = Term_TermType._( + 131, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'DAY_OF_WEEK'); + static const Term_TermType DAY_OF_YEAR = Term_TermType._( + 132, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'DAY_OF_YEAR'); + static const Term_TermType HOURS = Term_TermType._(133, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'HOURS'); + static const Term_TermType MINUTES = Term_TermType._(134, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MINUTES'); + static const Term_TermType SECONDS = Term_TermType._(135, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SECONDS'); + static const Term_TermType TIME = Term_TermType._(136, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'TIME'); + static const Term_TermType MONDAY = Term_TermType._(107, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'MONDAY'); + static const Term_TermType TUESDAY = Term_TermType._(108, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'TUESDAY'); + static const Term_TermType WEDNESDAY = Term_TermType._( + 109, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'WEDNESDAY'); + static const Term_TermType THURSDAY = Term_TermType._(110, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'THURSDAY'); + static const Term_TermType FRIDAY = Term_TermType._(111, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FRIDAY'); + static const Term_TermType SATURDAY = Term_TermType._(112, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SATURDAY'); + static const Term_TermType SUNDAY = Term_TermType._(113, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SUNDAY'); + static const Term_TermType JANUARY = Term_TermType._(114, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'JANUARY'); + static const Term_TermType FEBRUARY = Term_TermType._(115, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FEBRUARY'); + static const Term_TermType MARCH = Term_TermType._(116, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MARCH'); + static const Term_TermType APRIL = Term_TermType._(117, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'APRIL'); + static const Term_TermType MAY = Term_TermType._( + 118, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MAY'); + static const Term_TermType JUNE = Term_TermType._(119, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'JUNE'); + static const Term_TermType JULY = Term_TermType._(120, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'JULY'); + static const Term_TermType AUGUST = Term_TermType._(121, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'AUGUST'); + static const Term_TermType SEPTEMBER = Term_TermType._( + 122, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'SEPTEMBER'); + static const Term_TermType OCTOBER = Term_TermType._(123, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'OCTOBER'); + static const Term_TermType NOVEMBER = Term_TermType._(124, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'NOVEMBER'); + static const Term_TermType DECEMBER = Term_TermType._(125, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DECEMBER'); + static const Term_TermType LITERAL = Term_TermType._(137, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'LITERAL'); + static const Term_TermType GROUP = Term_TermType._(144, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GROUP'); + static const Term_TermType SUM = Term_TermType._( + 145, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SUM'); + static const Term_TermType AVG = Term_TermType._( + 146, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'AVG'); + static const Term_TermType MIN = Term_TermType._( + 147, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MIN'); + static const Term_TermType MAX = Term_TermType._( + 148, $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MAX'); + static const Term_TermType SPLIT = Term_TermType._(149, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'SPLIT'); + static const Term_TermType UNGROUP = Term_TermType._(150, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'UNGROUP'); + static const Term_TermType RANDOM = Term_TermType._(151, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'RANDOM'); + static const Term_TermType CHANGES = Term_TermType._(152, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CHANGES'); + static const Term_TermType ARGS = Term_TermType._(154, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'ARGS'); + static const Term_TermType BINARY = Term_TermType._(155, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BINARY'); + static const Term_TermType GEOJSON = Term_TermType._(157, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'GEOJSON'); + static const Term_TermType TO_GEOJSON = Term_TermType._( + 158, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TO_GEOJSON'); + static const Term_TermType POINT = Term_TermType._(159, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'POINT'); + static const Term_TermType LINE = Term_TermType._(160, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'LINE'); + static const Term_TermType POLYGON = Term_TermType._(161, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'POLYGON'); + static const Term_TermType DISTANCE = Term_TermType._(162, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'DISTANCE'); + static const Term_TermType INTERSECTS = Term_TermType._( + 163, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'INTERSECTS'); + static const Term_TermType INCLUDES = Term_TermType._(164, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'INCLUDES'); + static const Term_TermType CIRCLE = Term_TermType._(165, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'CIRCLE'); + static const Term_TermType GET_INTERSECTING = Term_TermType._( + 166, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'GET_INTERSECTING'); + static const Term_TermType FILL = Term_TermType._(167, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'FILL'); + static const Term_TermType GET_NEAREST = Term_TermType._( + 168, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'GET_NEAREST'); + static const Term_TermType POLYGON_SUB = Term_TermType._( + 171, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'POLYGON_SUB'); + static const Term_TermType TO_JSON_STRING = Term_TermType._( + 172, + $core.bool.fromEnvironment('protobuf.omit_enum_names') + ? '' + : 'TO_JSON_STRING'); + static const Term_TermType MINVAL = Term_TermType._(180, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MINVAL'); + static const Term_TermType MAXVAL = Term_TermType._(181, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'MAXVAL'); + static const Term_TermType BIT_AND = Term_TermType._(191, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BIT_AND'); + static const Term_TermType BIT_OR = Term_TermType._(192, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BIT_OR'); + static const Term_TermType BIT_XOR = Term_TermType._(193, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BIT_XOR'); + static const Term_TermType BIT_NOT = Term_TermType._(194, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BIT_NOT'); + static const Term_TermType BIT_SAL = Term_TermType._(195, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? 
'' : 'BIT_SAL'); + static const Term_TermType BIT_SAR = Term_TermType._(196, + $core.bool.fromEnvironment('protobuf.omit_enum_names') ? '' : 'BIT_SAR'); + + static const $core.List values = [ + DATUM, + MAKE_ARRAY, + MAKE_OBJ, + VAR, + JAVASCRIPT, + UUID, + HTTP, + ERROR, + IMPLICIT_VAR, + DB, + TABLE, + GET, + GET_ALL, + EQ, + NE, + LT, + LE, + GT, + GE, + NOT, + ADD, + SUB, + MUL, + DIV, + MOD, + FLOOR, + CEIL, + ROUND, + APPEND, + PREPEND, + DIFFERENCE, + SET_INSERT, + SET_INTERSECTION, + SET_UNION, + SET_DIFFERENCE, + SLICE, + SKIP, + LIMIT, + OFFSETS_OF, + CONTAINS, + GET_FIELD, + KEYS, + VALUES, + OBJECT, + HAS_FIELDS, + WITH_FIELDS, + PLUCK, + WITHOUT, + MERGE, + BETWEEN_DEPRECATED, + BETWEEN, + REDUCE, + MAP, + FOLD, + FILTER, + CONCAT_MAP, + ORDER_BY, + DISTINCT, + COUNT, + IS_EMPTY, + UNION, + NTH, + BRACKET, + INNER_JOIN, + OUTER_JOIN, + EQ_JOIN, + ZIP, + RANGE, + INSERT_AT, + DELETE_AT, + CHANGE_AT, + SPLICE_AT, + COERCE_TO, + TYPE_OF, + UPDATE, + DELETE, + REPLACE, + INSERT, + DB_CREATE, + DB_DROP, + DB_LIST, + TABLE_CREATE, + TABLE_DROP, + TABLE_LIST, + CONFIG, + STATUS, + WAIT, + RECONFIGURE, + REBALANCE, + SYNC, + GRANT, + INDEX_CREATE, + INDEX_DROP, + INDEX_LIST, + INDEX_STATUS, + INDEX_WAIT, + INDEX_RENAME, + SET_WRITE_HOOK, + GET_WRITE_HOOK, + FUNCALL, + BRANCH, + OR, + AND, + FOR_EACH, + FUNC, + ASC, + DESC, + INFO, + MATCH, + UPCASE, + DOWNCASE, + SAMPLE, + DEFAULT, + JSON, + ISO8601, + TO_ISO8601, + EPOCH_TIME, + TO_EPOCH_TIME, + NOW, + IN_TIMEZONE, + DURING, + DATE, + TIME_OF_DAY, + TIMEZONE, + YEAR, + MONTH, + DAY, + DAY_OF_WEEK, + DAY_OF_YEAR, + HOURS, + MINUTES, + SECONDS, + TIME, + MONDAY, + TUESDAY, + WEDNESDAY, + THURSDAY, + FRIDAY, + SATURDAY, + SUNDAY, + JANUARY, + FEBRUARY, + MARCH, + APRIL, + MAY, + JUNE, + JULY, + AUGUST, + SEPTEMBER, + OCTOBER, + NOVEMBER, + DECEMBER, + LITERAL, + GROUP, + SUM, + AVG, + MIN, + MAX, + SPLIT, + UNGROUP, + RANDOM, + CHANGES, + ARGS, + BINARY, + GEOJSON, + TO_GEOJSON, + POINT, + LINE, + POLYGON, + DISTANCE, + INTERSECTS, + INCLUDES, + CIRCLE, + GET_INTERSECTING, + FILL, + GET_NEAREST, + POLYGON_SUB, + TO_JSON_STRING, + MINVAL, + MAXVAL, + BIT_AND, + BIT_OR, + BIT_XOR, + BIT_NOT, + BIT_SAL, + BIT_SAR, + ]; + + static final $core.Map<$core.int, Term_TermType> _byValue = + $pb.ProtobufEnum.initByValue(values); + static Term_TermType? valueOf($core.int value) => _byValue[value]; + + const Term_TermType._($core.int v, $core.String n) : super(v, n); +} diff --git a/drivers/rethinkdb/lib/src/generated/ql2.pbjson.dart b/drivers/rethinkdb/lib/src/generated/ql2.pbjson.dart new file mode 100644 index 0000000..d63d3e1 --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/ql2.pbjson.dart @@ -0,0 +1,520 @@ +// Generated code. Do not modify. 
+// source: ql2.proto +// +// @dart = 2.12 +// ignore_for_file: annotate_overrides,camel_case_types,unnecessary_const,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type,unnecessary_this,prefer_final_fields,deprecated_member_use_from_same_package + +import 'dart:core' as $core; +import 'dart:convert' as $convert; +import 'dart:typed_data' as $typed_data; + +@$core.Deprecated('Use versionDummyDescriptor instead') +const VersionDummy$json = { + '1': 'VersionDummy', + '4': [VersionDummy_Version$json, VersionDummy_Protocol$json], +}; + +@$core.Deprecated('Use versionDummyDescriptor instead') +const VersionDummy_Version$json = { + '1': 'Version', + '2': [ + {'1': 'V0_1', '2': 1063369270}, + {'1': 'V0_2', '2': 1915781601}, + {'1': 'V0_3', '2': 1601562686}, + {'1': 'V0_4', '2': 1074539808}, + {'1': 'V1_0', '2': 885177795}, + ], +}; + +@$core.Deprecated('Use versionDummyDescriptor instead') +const VersionDummy_Protocol$json = { + '1': 'Protocol', + '2': [ + {'1': 'PROTOBUF', '2': 656407617}, + {'1': 'JSON', '2': 2120839367}, + ], +}; + +/// Descriptor for `VersionDummy`. Decode as a `google.protobuf.DescriptorProto`. +final $typed_data.Uint8List versionDummyDescriptor = $convert.base64Decode( + 'CgxWZXJzaW9uRHVtbXkiTwoHVmVyc2lvbhIMCgRWMF8xELb0hvsDEgwKBFYwXzIQ4YPCkQcSDAoEVjBfMxC+0Nf7BRIMCgRWMF80EKDasIAEEgwKBFYxXzAQw/uKpgMiKgoIUHJvdG9jb2wSEAoIUFJPVE9CVUYQwfj/uAISDAoESlNPThDH4aXzBw=='); +@$core.Deprecated('Use queryDescriptor instead') +const Query$json = { + '1': 'Query', + '2': [ + { + '1': 'type', + '3': 1, + '4': 1, + '5': 14, + '6': '.Query.QueryType', + '10': 'type' + }, + {'1': 'query', '3': 2, '4': 1, '5': 11, '6': '.Term', '10': 'query'}, + {'1': 'token', '3': 3, '4': 1, '5': 3, '10': 'token'}, + { + '1': 'OBSOLETE_noreply', + '3': 4, + '4': 1, + '5': 8, + '7': 'false', + '10': 'OBSOLETENoreply' + }, + { + '1': 'accepts_r_json', + '3': 5, + '4': 1, + '5': 8, + '7': 'false', + '10': 'acceptsRJson' + }, + { + '1': 'global_optargs', + '3': 6, + '4': 3, + '5': 11, + '6': '.Query.AssocPair', + '10': 'globalOptargs' + }, + ], + '3': [Query_AssocPair$json], + '4': [Query_QueryType$json], +}; + +@$core.Deprecated('Use queryDescriptor instead') +const Query_AssocPair$json = { + '1': 'AssocPair', + '2': [ + {'1': 'key', '3': 1, '4': 1, '5': 9, '10': 'key'}, + {'1': 'val', '3': 2, '4': 1, '5': 11, '6': '.Term', '10': 'val'}, + ], +}; + +@$core.Deprecated('Use queryDescriptor instead') +const Query_QueryType$json = { + '1': 'QueryType', + '2': [ + {'1': 'START', '2': 1}, + {'1': 'CONTINUE', '2': 2}, + {'1': 'STOP', '2': 3}, + {'1': 'NOREPLY_WAIT', '2': 4}, + {'1': 'SERVER_INFO', '2': 5}, + ], +}; + +/// Descriptor for `Query`. Decode as a `google.protobuf.DescriptorProto`. 
+final $typed_data.Uint8List queryDescriptor = $convert.base64Decode( + 'CgVRdWVyeRIkCgR0eXBlGAEgASgOMhAuUXVlcnkuUXVlcnlUeXBlUgR0eXBlEhsKBXF1ZXJ5GAIgASgLMgUuVGVybVIFcXVlcnkSFAoFdG9rZW4YAyABKANSBXRva2VuEjAKEE9CU09MRVRFX25vcmVwbHkYBCABKAg6BWZhbHNlUg9PQlNPTEVURU5vcmVwbHkSKwoOYWNjZXB0c19yX2pzb24YBSABKAg6BWZhbHNlUgxhY2NlcHRzUkpzb24SNwoOZ2xvYmFsX29wdGFyZ3MYBiADKAsyEC5RdWVyeS5Bc3NvY1BhaXJSDWdsb2JhbE9wdGFyZ3MaNgoJQXNzb2NQYWlyEhAKA2tleRgBIAEoCVIDa2V5EhcKA3ZhbBgCIAEoCzIFLlRlcm1SA3ZhbCJRCglRdWVyeVR5cGUSCQoFU1RBUlQQARIMCghDT05USU5VRRACEggKBFNUT1AQAxIQCgxOT1JFUExZX1dBSVQQBBIPCgtTRVJWRVJfSU5GTxAF'); +@$core.Deprecated('Use frameDescriptor instead') +const Frame$json = { + '1': 'Frame', + '2': [ + { + '1': 'type', + '3': 1, + '4': 1, + '5': 14, + '6': '.Frame.FrameType', + '10': 'type' + }, + {'1': 'pos', '3': 2, '4': 1, '5': 3, '10': 'pos'}, + {'1': 'opt', '3': 3, '4': 1, '5': 9, '10': 'opt'}, + ], + '4': [Frame_FrameType$json], +}; + +@$core.Deprecated('Use frameDescriptor instead') +const Frame_FrameType$json = { + '1': 'FrameType', + '2': [ + {'1': 'POS', '2': 1}, + {'1': 'OPT', '2': 2}, + ], +}; + +/// Descriptor for `Frame`. Decode as a `google.protobuf.DescriptorProto`. +final $typed_data.Uint8List frameDescriptor = $convert.base64Decode( + 'CgVGcmFtZRIkCgR0eXBlGAEgASgOMhAuRnJhbWUuRnJhbWVUeXBlUgR0eXBlEhAKA3BvcxgCIAEoA1IDcG9zEhAKA29wdBgDIAEoCVIDb3B0Ih0KCUZyYW1lVHlwZRIHCgNQT1MQARIHCgNPUFQQAg=='); +@$core.Deprecated('Use backtraceDescriptor instead') +const Backtrace$json = { + '1': 'Backtrace', + '2': [ + {'1': 'frames', '3': 1, '4': 3, '5': 11, '6': '.Frame', '10': 'frames'}, + ], +}; + +/// Descriptor for `Backtrace`. Decode as a `google.protobuf.DescriptorProto`. +final $typed_data.Uint8List backtraceDescriptor = $convert.base64Decode( + 'CglCYWNrdHJhY2USHgoGZnJhbWVzGAEgAygLMgYuRnJhbWVSBmZyYW1lcw=='); +@$core.Deprecated('Use responseDescriptor instead') +const Response$json = { + '1': 'Response', + '2': [ + { + '1': 'type', + '3': 1, + '4': 1, + '5': 14, + '6': '.Response.ResponseType', + '10': 'type' + }, + { + '1': 'error_type', + '3': 7, + '4': 1, + '5': 14, + '6': '.Response.ErrorType', + '10': 'errorType' + }, + { + '1': 'notes', + '3': 6, + '4': 3, + '5': 14, + '6': '.Response.ResponseNote', + '10': 'notes' + }, + {'1': 'token', '3': 2, '4': 1, '5': 3, '10': 'token'}, + {'1': 'response', '3': 3, '4': 3, '5': 11, '6': '.Datum', '10': 'response'}, + { + '1': 'backtrace', + '3': 4, + '4': 1, + '5': 11, + '6': '.Backtrace', + '10': 'backtrace' + }, + {'1': 'profile', '3': 5, '4': 1, '5': 11, '6': '.Datum', '10': 'profile'}, + ], + '4': [ + Response_ResponseType$json, + Response_ErrorType$json, + Response_ResponseNote$json + ], +}; + +@$core.Deprecated('Use responseDescriptor instead') +const Response_ResponseType$json = { + '1': 'ResponseType', + '2': [ + {'1': 'SUCCESS_ATOM', '2': 1}, + {'1': 'SUCCESS_SEQUENCE', '2': 2}, + {'1': 'SUCCESS_PARTIAL', '2': 3}, + {'1': 'WAIT_COMPLETE', '2': 4}, + {'1': 'SERVER_INFO', '2': 5}, + {'1': 'CLIENT_ERROR', '2': 16}, + {'1': 'COMPILE_ERROR', '2': 17}, + {'1': 'RUNTIME_ERROR', '2': 18}, + ], +}; + +@$core.Deprecated('Use responseDescriptor instead') +const Response_ErrorType$json = { + '1': 'ErrorType', + '2': [ + {'1': 'INTERNAL', '2': 1000000}, + {'1': 'RESOURCE_LIMIT', '2': 2000000}, + {'1': 'QUERY_LOGIC', '2': 3000000}, + {'1': 'NON_EXISTENCE', '2': 3100000}, + {'1': 'OP_FAILED', '2': 4100000}, + {'1': 'OP_INDETERMINATE', '2': 4200000}, + {'1': 'USER', '2': 5000000}, + {'1': 'PERMISSION_ERROR', '2': 6000000}, + ], +}; + +@$core.Deprecated('Use 
responseDescriptor instead') +const Response_ResponseNote$json = { + '1': 'ResponseNote', + '2': [ + {'1': 'SEQUENCE_FEED', '2': 1}, + {'1': 'ATOM_FEED', '2': 2}, + {'1': 'ORDER_BY_LIMIT_FEED', '2': 3}, + {'1': 'UNIONED_FEED', '2': 4}, + {'1': 'INCLUDES_STATES', '2': 5}, + ], +}; + +/// Descriptor for `Response`. Decode as a `google.protobuf.DescriptorProto`. +final $typed_data.Uint8List responseDescriptor = $convert.base64Decode( + 'CghSZXNwb25zZRIqCgR0eXBlGAEgASgOMhYuUmVzcG9uc2UuUmVzcG9uc2VUeXBlUgR0eXBlEjIKCmVycm9yX3R5cGUYByABKA4yEy5SZXNwb25zZS5FcnJvclR5cGVSCWVycm9yVHlwZRIsCgVub3RlcxgGIAMoDjIWLlJlc3BvbnNlLlJlc3BvbnNlTm90ZVIFbm90ZXMSFAoFdG9rZW4YAiABKANSBXRva2VuEiIKCHJlc3BvbnNlGAMgAygLMgYuRGF0dW1SCHJlc3BvbnNlEigKCWJhY2t0cmFjZRgEIAEoCzIKLkJhY2t0cmFjZVIJYmFja3RyYWNlEiAKB3Byb2ZpbGUYBSABKAsyBi5EYXR1bVIHcHJvZmlsZSKnAQoMUmVzcG9uc2VUeXBlEhAKDFNVQ0NFU1NfQVRPTRABEhQKEFNVQ0NFU1NfU0VRVUVOQ0UQAhITCg9TVUNDRVNTX1BBUlRJQUwQAxIRCg1XQUlUX0NPTVBMRVRFEAQSDwoLU0VSVkVSX0lORk8QBRIQCgxDTElFTlRfRVJST1IQEBIRCg1DT01QSUxFX0VSUk9SEBESEQoNUlVOVElNRV9FUlJPUhASIqwBCglFcnJvclR5cGUSDgoISU5URVJOQUwQwIQ9EhQKDlJFU09VUkNFX0xJTUlUEICJehISCgtRVUVSWV9MT0dJQxDAjbcBEhQKDU5PTl9FWElTVEVOQ0UQ4Jq9ARIQCglPUF9GQUlMRUQQoJ/6ARIXChBPUF9JTkRFVEVSTUlOQVRFEMCsgAISCwoEVVNFUhDAlrECEhcKEFBFUk1JU1NJT05fRVJST1IQgJvuAiJwCgxSZXNwb25zZU5vdGUSEQoNU0VRVUVOQ0VfRkVFRBABEg0KCUFUT01fRkVFRBACEhcKE09SREVSX0JZX0xJTUlUX0ZFRUQQAxIQCgxVTklPTkVEX0ZFRUQQBBITCg9JTkNMVURFU19TVEFURVMQBQ=='); +@$core.Deprecated('Use datumDescriptor instead') +const Datum$json = { + '1': 'Datum', + '2': [ + { + '1': 'type', + '3': 1, + '4': 1, + '5': 14, + '6': '.Datum.DatumType', + '10': 'type' + }, + {'1': 'r_bool', '3': 2, '4': 1, '5': 8, '10': 'rBool'}, + {'1': 'r_num', '3': 3, '4': 1, '5': 1, '10': 'rNum'}, + {'1': 'r_str', '3': 4, '4': 1, '5': 9, '10': 'rStr'}, + {'1': 'r_array', '3': 5, '4': 3, '5': 11, '6': '.Datum', '10': 'rArray'}, + { + '1': 'r_object', + '3': 6, + '4': 3, + '5': 11, + '6': '.Datum.AssocPair', + '10': 'rObject' + }, + ], + '3': [Datum_AssocPair$json], + '4': [Datum_DatumType$json], +}; + +@$core.Deprecated('Use datumDescriptor instead') +const Datum_AssocPair$json = { + '1': 'AssocPair', + '2': [ + {'1': 'key', '3': 1, '4': 1, '5': 9, '10': 'key'}, + {'1': 'val', '3': 2, '4': 1, '5': 11, '6': '.Datum', '10': 'val'}, + ], +}; + +@$core.Deprecated('Use datumDescriptor instead') +const Datum_DatumType$json = { + '1': 'DatumType', + '2': [ + {'1': 'R_NULL', '2': 1}, + {'1': 'R_BOOL', '2': 2}, + {'1': 'R_NUM', '2': 3}, + {'1': 'R_STR', '2': 4}, + {'1': 'R_ARRAY', '2': 5}, + {'1': 'R_OBJECT', '2': 6}, + {'1': 'R_JSON', '2': 7}, + ], +}; + +/// Descriptor for `Datum`. Decode as a `google.protobuf.DescriptorProto`. 
+final $typed_data.Uint8List datumDescriptor = $convert.base64Decode( + 'CgVEYXR1bRIkCgR0eXBlGAEgASgOMhAuRGF0dW0uRGF0dW1UeXBlUgR0eXBlEhUKBnJfYm9vbBgCIAEoCFIFckJvb2wSEwoFcl9udW0YAyABKAFSBHJOdW0SEwoFcl9zdHIYBCABKAlSBHJTdHISHwoHcl9hcnJheRgFIAMoCzIGLkRhdHVtUgZyQXJyYXkSKwoIcl9vYmplY3QYBiADKAsyEC5EYXR1bS5Bc3NvY1BhaXJSB3JPYmplY3QaNwoJQXNzb2NQYWlyEhAKA2tleRgBIAEoCVIDa2V5EhgKA3ZhbBgCIAEoCzIGLkRhdHVtUgN2YWwiYAoJRGF0dW1UeXBlEgoKBlJfTlVMTBABEgoKBlJfQk9PTBACEgkKBVJfTlVNEAMSCQoFUl9TVFIQBBILCgdSX0FSUkFZEAUSDAoIUl9PQkpFQ1QQBhIKCgZSX0pTT04QBw=='); +@$core.Deprecated('Use termDescriptor instead') +const Term$json = { + '1': 'Term', + '2': [ + {'1': 'type', '3': 1, '4': 1, '5': 14, '6': '.Term.TermType', '10': 'type'}, + {'1': 'datum', '3': 2, '4': 1, '5': 11, '6': '.Datum', '10': 'datum'}, + {'1': 'args', '3': 3, '4': 3, '5': 11, '6': '.Term', '10': 'args'}, + { + '1': 'optargs', + '3': 4, + '4': 3, + '5': 11, + '6': '.Term.AssocPair', + '10': 'optargs' + }, + ], + '3': [Term_AssocPair$json], + '4': [Term_TermType$json], +}; + +@$core.Deprecated('Use termDescriptor instead') +const Term_AssocPair$json = { + '1': 'AssocPair', + '2': [ + {'1': 'key', '3': 1, '4': 1, '5': 9, '10': 'key'}, + {'1': 'val', '3': 2, '4': 1, '5': 11, '6': '.Term', '10': 'val'}, + ], +}; + +@$core.Deprecated('Use termDescriptor instead') +const Term_TermType$json = { + '1': 'TermType', + '2': [ + {'1': 'DATUM', '2': 1}, + {'1': 'MAKE_ARRAY', '2': 2}, + {'1': 'MAKE_OBJ', '2': 3}, + {'1': 'VAR', '2': 10}, + {'1': 'JAVASCRIPT', '2': 11}, + {'1': 'UUID', '2': 169}, + {'1': 'HTTP', '2': 153}, + {'1': 'ERROR', '2': 12}, + {'1': 'IMPLICIT_VAR', '2': 13}, + {'1': 'DB', '2': 14}, + {'1': 'TABLE', '2': 15}, + {'1': 'GET', '2': 16}, + {'1': 'GET_ALL', '2': 78}, + {'1': 'EQ', '2': 17}, + {'1': 'NE', '2': 18}, + {'1': 'LT', '2': 19}, + {'1': 'LE', '2': 20}, + {'1': 'GT', '2': 21}, + {'1': 'GE', '2': 22}, + {'1': 'NOT', '2': 23}, + {'1': 'ADD', '2': 24}, + {'1': 'SUB', '2': 25}, + {'1': 'MUL', '2': 26}, + {'1': 'DIV', '2': 27}, + {'1': 'MOD', '2': 28}, + {'1': 'FLOOR', '2': 183}, + {'1': 'CEIL', '2': 184}, + {'1': 'ROUND', '2': 185}, + {'1': 'APPEND', '2': 29}, + {'1': 'PREPEND', '2': 80}, + {'1': 'DIFFERENCE', '2': 95}, + {'1': 'SET_INSERT', '2': 88}, + {'1': 'SET_INTERSECTION', '2': 89}, + {'1': 'SET_UNION', '2': 90}, + {'1': 'SET_DIFFERENCE', '2': 91}, + {'1': 'SLICE', '2': 30}, + {'1': 'SKIP', '2': 70}, + {'1': 'LIMIT', '2': 71}, + {'1': 'OFFSETS_OF', '2': 87}, + {'1': 'CONTAINS', '2': 93}, + {'1': 'GET_FIELD', '2': 31}, + {'1': 'KEYS', '2': 94}, + {'1': 'VALUES', '2': 186}, + {'1': 'OBJECT', '2': 143}, + {'1': 'HAS_FIELDS', '2': 32}, + {'1': 'WITH_FIELDS', '2': 96}, + {'1': 'PLUCK', '2': 33}, + {'1': 'WITHOUT', '2': 34}, + {'1': 'MERGE', '2': 35}, + {'1': 'BETWEEN_DEPRECATED', '2': 36}, + {'1': 'BETWEEN', '2': 182}, + {'1': 'REDUCE', '2': 37}, + {'1': 'MAP', '2': 38}, + {'1': 'FOLD', '2': 187}, + {'1': 'FILTER', '2': 39}, + {'1': 'CONCAT_MAP', '2': 40}, + {'1': 'ORDER_BY', '2': 41}, + {'1': 'DISTINCT', '2': 42}, + {'1': 'COUNT', '2': 43}, + {'1': 'IS_EMPTY', '2': 86}, + {'1': 'UNION', '2': 44}, + {'1': 'NTH', '2': 45}, + {'1': 'BRACKET', '2': 170}, + {'1': 'INNER_JOIN', '2': 48}, + {'1': 'OUTER_JOIN', '2': 49}, + {'1': 'EQ_JOIN', '2': 50}, + {'1': 'ZIP', '2': 72}, + {'1': 'RANGE', '2': 173}, + {'1': 'INSERT_AT', '2': 82}, + {'1': 'DELETE_AT', '2': 83}, + {'1': 'CHANGE_AT', '2': 84}, + {'1': 'SPLICE_AT', '2': 85}, + {'1': 'COERCE_TO', '2': 51}, + {'1': 'TYPE_OF', '2': 52}, + {'1': 'UPDATE', '2': 53}, + {'1': 'DELETE', '2': 54}, + {'1': 
'REPLACE', '2': 55}, + {'1': 'INSERT', '2': 56}, + {'1': 'DB_CREATE', '2': 57}, + {'1': 'DB_DROP', '2': 58}, + {'1': 'DB_LIST', '2': 59}, + {'1': 'TABLE_CREATE', '2': 60}, + {'1': 'TABLE_DROP', '2': 61}, + {'1': 'TABLE_LIST', '2': 62}, + {'1': 'CONFIG', '2': 174}, + {'1': 'STATUS', '2': 175}, + {'1': 'WAIT', '2': 177}, + {'1': 'RECONFIGURE', '2': 176}, + {'1': 'REBALANCE', '2': 179}, + {'1': 'SYNC', '2': 138}, + {'1': 'GRANT', '2': 188}, + {'1': 'INDEX_CREATE', '2': 75}, + {'1': 'INDEX_DROP', '2': 76}, + {'1': 'INDEX_LIST', '2': 77}, + {'1': 'INDEX_STATUS', '2': 139}, + {'1': 'INDEX_WAIT', '2': 140}, + {'1': 'INDEX_RENAME', '2': 156}, + {'1': 'SET_WRITE_HOOK', '2': 189}, + {'1': 'GET_WRITE_HOOK', '2': 190}, + {'1': 'FUNCALL', '2': 64}, + {'1': 'BRANCH', '2': 65}, + {'1': 'OR', '2': 66}, + {'1': 'AND', '2': 67}, + {'1': 'FOR_EACH', '2': 68}, + {'1': 'FUNC', '2': 69}, + {'1': 'ASC', '2': 73}, + {'1': 'DESC', '2': 74}, + {'1': 'INFO', '2': 79}, + {'1': 'MATCH', '2': 97}, + {'1': 'UPCASE', '2': 141}, + {'1': 'DOWNCASE', '2': 142}, + {'1': 'SAMPLE', '2': 81}, + {'1': 'DEFAULT', '2': 92}, + {'1': 'JSON', '2': 98}, + {'1': 'ISO8601', '2': 99}, + {'1': 'TO_ISO8601', '2': 100}, + {'1': 'EPOCH_TIME', '2': 101}, + {'1': 'TO_EPOCH_TIME', '2': 102}, + {'1': 'NOW', '2': 103}, + {'1': 'IN_TIMEZONE', '2': 104}, + {'1': 'DURING', '2': 105}, + {'1': 'DATE', '2': 106}, + {'1': 'TIME_OF_DAY', '2': 126}, + {'1': 'TIMEZONE', '2': 127}, + {'1': 'YEAR', '2': 128}, + {'1': 'MONTH', '2': 129}, + {'1': 'DAY', '2': 130}, + {'1': 'DAY_OF_WEEK', '2': 131}, + {'1': 'DAY_OF_YEAR', '2': 132}, + {'1': 'HOURS', '2': 133}, + {'1': 'MINUTES', '2': 134}, + {'1': 'SECONDS', '2': 135}, + {'1': 'TIME', '2': 136}, + {'1': 'MONDAY', '2': 107}, + {'1': 'TUESDAY', '2': 108}, + {'1': 'WEDNESDAY', '2': 109}, + {'1': 'THURSDAY', '2': 110}, + {'1': 'FRIDAY', '2': 111}, + {'1': 'SATURDAY', '2': 112}, + {'1': 'SUNDAY', '2': 113}, + {'1': 'JANUARY', '2': 114}, + {'1': 'FEBRUARY', '2': 115}, + {'1': 'MARCH', '2': 116}, + {'1': 'APRIL', '2': 117}, + {'1': 'MAY', '2': 118}, + {'1': 'JUNE', '2': 119}, + {'1': 'JULY', '2': 120}, + {'1': 'AUGUST', '2': 121}, + {'1': 'SEPTEMBER', '2': 122}, + {'1': 'OCTOBER', '2': 123}, + {'1': 'NOVEMBER', '2': 124}, + {'1': 'DECEMBER', '2': 125}, + {'1': 'LITERAL', '2': 137}, + {'1': 'GROUP', '2': 144}, + {'1': 'SUM', '2': 145}, + {'1': 'AVG', '2': 146}, + {'1': 'MIN', '2': 147}, + {'1': 'MAX', '2': 148}, + {'1': 'SPLIT', '2': 149}, + {'1': 'UNGROUP', '2': 150}, + {'1': 'RANDOM', '2': 151}, + {'1': 'CHANGES', '2': 152}, + {'1': 'ARGS', '2': 154}, + {'1': 'BINARY', '2': 155}, + {'1': 'GEOJSON', '2': 157}, + {'1': 'TO_GEOJSON', '2': 158}, + {'1': 'POINT', '2': 159}, + {'1': 'LINE', '2': 160}, + {'1': 'POLYGON', '2': 161}, + {'1': 'DISTANCE', '2': 162}, + {'1': 'INTERSECTS', '2': 163}, + {'1': 'INCLUDES', '2': 164}, + {'1': 'CIRCLE', '2': 165}, + {'1': 'GET_INTERSECTING', '2': 166}, + {'1': 'FILL', '2': 167}, + {'1': 'GET_NEAREST', '2': 168}, + {'1': 'POLYGON_SUB', '2': 171}, + {'1': 'TO_JSON_STRING', '2': 172}, + {'1': 'MINVAL', '2': 180}, + {'1': 'MAXVAL', '2': 181}, + {'1': 'BIT_AND', '2': 191}, + {'1': 'BIT_OR', '2': 192}, + {'1': 'BIT_XOR', '2': 193}, + {'1': 'BIT_NOT', '2': 194}, + {'1': 'BIT_SAL', '2': 195}, + {'1': 'BIT_SAR', '2': 196}, + ], +}; + +/// Descriptor for `Term`. Decode as a `google.protobuf.DescriptorProto`. 
+final $typed_data.Uint8List termDescriptor = $convert.base64Decode( + 'CgRUZXJtEiIKBHR5cGUYASABKA4yDi5UZXJtLlRlcm1UeXBlUgR0eXBlEhwKBWRhdHVtGAIgASgLMgYuRGF0dW1SBWRhdHVtEhkKBGFyZ3MYAyADKAsyBS5UZXJtUgRhcmdzEikKB29wdGFyZ3MYBCADKAsyDy5UZXJtLkFzc29jUGFpclIHb3B0YXJncxo2CglBc3NvY1BhaXISEAoDa2V5GAEgASgJUgNrZXkSFwoDdmFsGAIgASgLMgUuVGVybVIDdmFsIp0TCghUZXJtVHlwZRIJCgVEQVRVTRABEg4KCk1BS0VfQVJSQVkQAhIMCghNQUtFX09CShADEgcKA1ZBUhAKEg4KCkpBVkFTQ1JJUFQQCxIJCgRVVUlEEKkBEgkKBEhUVFAQmQESCQoFRVJST1IQDBIQCgxJTVBMSUNJVF9WQVIQDRIGCgJEQhAOEgkKBVRBQkxFEA8SBwoDR0VUEBASCwoHR0VUX0FMTBBOEgYKAkVREBESBgoCTkUQEhIGCgJMVBATEgYKAkxFEBQSBgoCR1QQFRIGCgJHRRAWEgcKA05PVBAXEgcKA0FERBAYEgcKA1NVQhAZEgcKA01VTBAaEgcKA0RJVhAbEgcKA01PRBAcEgoKBUZMT09SELcBEgkKBENFSUwQuAESCgoFUk9VTkQQuQESCgoGQVBQRU5EEB0SCwoHUFJFUEVORBBQEg4KCkRJRkZFUkVOQ0UQXxIOCgpTRVRfSU5TRVJUEFgSFAoQU0VUX0lOVEVSU0VDVElPThBZEg0KCVNFVF9VTklPThBaEhIKDlNFVF9ESUZGRVJFTkNFEFsSCQoFU0xJQ0UQHhIICgRTS0lQEEYSCQoFTElNSVQQRxIOCgpPRkZTRVRTX09GEFcSDAoIQ09OVEFJTlMQXRINCglHRVRfRklFTEQQHxIICgRLRVlTEF4SCwoGVkFMVUVTELoBEgsKBk9CSkVDVBCPARIOCgpIQVNfRklFTERTECASDwoLV0lUSF9GSUVMRFMQYBIJCgVQTFVDSxAhEgsKB1dJVEhPVVQQIhIJCgVNRVJHRRAjEhYKEkJFVFdFRU5fREVQUkVDQVRFRBAkEgwKB0JFVFdFRU4QtgESCgoGUkVEVUNFECUSBwoDTUFQECYSCQoERk9MRBC7ARIKCgZGSUxURVIQJxIOCgpDT05DQVRfTUFQECgSDAoIT1JERVJfQlkQKRIMCghESVNUSU5DVBAqEgkKBUNPVU5UECsSDAoISVNfRU1QVFkQVhIJCgVVTklPThAsEgcKA05USBAtEgwKB0JSQUNLRVQQqgESDgoKSU5ORVJfSk9JThAwEg4KCk9VVEVSX0pPSU4QMRILCgdFUV9KT0lOEDISBwoDWklQEEgSCgoFUkFOR0UQrQESDQoJSU5TRVJUX0FUEFISDQoJREVMRVRFX0FUEFMSDQoJQ0hBTkdFX0FUEFQSDQoJU1BMSUNFX0FUEFUSDQoJQ09FUkNFX1RPEDMSCwoHVFlQRV9PRhA0EgoKBlVQREFURRA1EgoKBkRFTEVURRA2EgsKB1JFUExBQ0UQNxIKCgZJTlNFUlQQOBINCglEQl9DUkVBVEUQORILCgdEQl9EUk9QEDoSCwoHREJfTElTVBA7EhAKDFRBQkxFX0NSRUFURRA8Eg4KClRBQkxFX0RST1AQPRIOCgpUQUJMRV9MSVNUED4SCwoGQ09ORklHEK4BEgsKBlNUQVRVUxCvARIJCgRXQUlUELEBEhAKC1JFQ09ORklHVVJFELABEg4KCVJFQkFMQU5DRRCzARIJCgRTWU5DEIoBEgoKBUdSQU5UELwBEhAKDElOREVYX0NSRUFURRBLEg4KCklOREVYX0RST1AQTBIOCgpJTkRFWF9MSVNUEE0SEQoMSU5ERVhfU1RBVFVTEIsBEg8KCklOREVYX1dBSVQQjAESEQoMSU5ERVhfUkVOQU1FEJwBEhMKDlNFVF9XUklURV9IT09LEL0BEhMKDkdFVF9XUklURV9IT09LEL4BEgsKB0ZVTkNBTEwQQBIKCgZCUkFOQ0gQQRIGCgJPUhBCEgcKA0FORBBDEgwKCEZPUl9FQUNIEEQSCAoERlVOQxBFEgcKA0FTQxBJEggKBERFU0MQShIICgRJTkZPEE8SCQoFTUFUQ0gQYRILCgZVUENBU0UQjQESDQoIRE9XTkNBU0UQjgESCgoGU0FNUExFEFESCwoHREVGQVVMVBBcEggKBEpTT04QYhILCgdJU084NjAxEGMSDgoKVE9fSVNPODYwMRBkEg4KCkVQT0NIX1RJTUUQZRIRCg1UT19FUE9DSF9USU1FEGYSBwoDTk9XEGcSDwoLSU5fVElNRVpPTkUQaBIKCgZEVVJJTkcQaRIICgREQVRFEGoSDwoLVElNRV9PRl9EQVkQfhIMCghUSU1FWk9ORRB/EgkKBFlFQVIQgAESCgoFTU9OVEgQgQESCAoDREFZEIIBEhAKC0RBWV9PRl9XRUVLEIMBEhAKC0RBWV9PRl9ZRUFSEIQBEgoKBUhPVVJTEIUBEgwKB01JTlVURVMQhgESDAoHU0VDT05EUxCHARIJCgRUSU1FEIgBEgoKBk1PTkRBWRBrEgsKB1RVRVNEQVkQbBINCglXRURORVNEQVkQbRIMCghUSFVSU0RBWRBuEgoKBkZSSURBWRBvEgwKCFNBVFVSREFZEHASCgoGU1VOREFZEHESCwoHSkFOVUFSWRByEgwKCEZFQlJVQVJZEHMSCQoFTUFSQ0gQdBIJCgVBUFJJTBB1EgcKA01BWRB2EggKBEpVTkUQdxIICgRKVUxZEHgSCgoGQVVHVVNUEHkSDQoJU0VQVEVNQkVSEHoSCwoHT0NUT0JFUhB7EgwKCE5PVkVNQkVSEHwSDAoIREVDRU1CRVIQfRIMCgdMSVRFUkFMEIkBEgoKBUdST1VQEJABEggKA1NVTRCRARIICgNBVkcQkgESCAoDTUlOEJMBEggKA01BWBCUARIKCgVTUExJVBCVARIMCgdVTkdST1VQEJYBEgsKBlJBTkRPTRCXARIMCgdDSEFOR0VTEJgBEgkKBEFSR1MQmgESCwoGQklOQVJZEJsBEgwKB0dFT0pTT04QnQESDwoKVE9fR0VPSlNPThCeARIKCgVQT0lOVBCfARIJCgRMSU5FEKABEgwKB1BPTFlHT04QoQESDQoIRElTVEFOQ0UQogESDwoKSU5URVJTRUNUUxCjARINCghJTkNMVURFUxCkARILCgZDSVJDTEUQpQESFQoQR0VUX0lOVEVSU0VDVElORxCmARIJCgRGSUxMEKcBEhAKC0dFVF9ORUFSRVNUEKgBEhAKC1BPTFlHT05fU1VCEKsBEhMKDlRPX0pTT05fU1RSSU5HEKwBEgsKBk1JTlZBTBC0ARILCgZNQVhWQUwQtQESDAoHQklUX0FORBC/ARILCgZCSVRfT1IQwAESDAoHQkl
UX1hPUhDBARIMCgdCSVRfTk9UEMIBEgwKB0JJVF9TQUwQwwESDAoHQklUX1NBUhDEAQ=='); diff --git a/drivers/rethinkdb/lib/src/generated/ql2.pbserver.dart b/drivers/rethinkdb/lib/src/generated/ql2.pbserver.dart new file mode 100644 index 0000000..8b4164b --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/ql2.pbserver.dart @@ -0,0 +1,7 @@ +// Generated code. Do not modify. +// source: ql2.proto +// +// @dart = 2.12 +// ignore_for_file: annotate_overrides,camel_case_types,unnecessary_const,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type,unnecessary_this,prefer_final_fields,deprecated_member_use_from_same_package + +export 'ql2.pb.dart'; diff --git a/drivers/rethinkdb/lib/src/generated/ql2.proto b/drivers/rethinkdb/lib/src/generated/ql2.proto new file mode 100644 index 0000000..fa16859 --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/ql2.proto @@ -0,0 +1,862 @@ +//////////////////////////////////////////////////////////////////////////////// +// THE HIGH-LEVEL VIEW // +//////////////////////////////////////////////////////////////////////////////// + +// Process: When you first open a connection, send the magic number +// for the version of the protobuf you're targeting (in the [Version] +// enum). This should **NOT** be sent as a protobuf; just send the +// little-endian 32-bit integer over the wire raw. This number should +// only be sent once per connection. + +// The magic number shall be followed by an authorization key. The +// first 4 bytes are the length of the key to be sent as a little-endian +// 32-bit integer, followed by the key string. Even if there is no key, +// an empty string should be sent (length 0 and no data). + +// Following the authorization key, the client shall send a magic number +// for the communication protocol they want to use (in the [Protocol] +// enum). This shall be a little-endian 32-bit integer. + +// The server will then respond with a NULL-terminated string response. +// "SUCCESS" indicates that the connection has been accepted. Any other +// response indicates an error, and the response string should describe +// the error. + +// Next, for each query you want to send, construct a [Query] protobuf +// and serialize it to a binary blob. Send the blob's size to the +// server encoded as a little-endian 32-bit integer, followed by the +// blob itself. You will receive a [Response] protobuf back preceded +// by its own size, once again encoded as a little-endian 32-bit +// integer. You can see an example exchange below in **EXAMPLE**. + +// A query consists of a [Term] to evaluate and a unique-per-connection +// [token]. + +// Tokens are used for two things: +// * Keeping track of which responses correspond to which queries. +// * Batched queries. Some queries return lots of results, so we send back +// batches of <1000, and you need to send a [CONTINUE] query with the same +// token to get more results from the original query. +//////////////////////////////////////////////////////////////////////////////// + +syntax = "proto2"; + +message VersionDummy { // We need to wrap it like this for some + // non-conforming protobuf libraries + // This enum contains the magic numbers for your version. See **THE HIGH-LEVEL + // VIEW** for what to do with it.
+ enum Version { + V0_1 = 0x3f61ba36; + V0_2 = 0x723081e1; // Authorization key during handshake + V0_3 = 0x5f75e83e; // Authorization key and protocol during handshake + V0_4 = 0x400c2d20; // Queries execute in parallel + V1_0 = 0x34c2bdc3; // Users and permissions + } + + // The protocol to use after the handshake, specified in V0_3 + enum Protocol { + PROTOBUF = 0x271ffc41; + JSON = 0x7e6970c7; + } +} + +// You send one of: +// * A [START] query with a [Term] to evaluate and a unique-per-connection token. +// * A [CONTINUE] query with the same token as a [START] query that returned +// [SUCCESS_PARTIAL] in its [Response]. +// * A [STOP] query with the same token as a [START] query that you want to stop. +// * A [NOREPLY_WAIT] query with a unique per-connection token. The server answers +// with a [WAIT_COMPLETE] [Response]. +// * A [SERVER_INFO] query. The server answers with a [SERVER_INFO] [Response]. +message Query { + enum QueryType { + START = 1; // Start a new query. + CONTINUE = 2; // Continue a query that returned [SUCCESS_PARTIAL] + // (see [Response]). + STOP = 3; // Stop a query partway through executing. + NOREPLY_WAIT = 4; // Wait for noreply operations to finish. + SERVER_INFO = 5; // Get server information. + } + optional QueryType type = 1; + // A [Term] is how we represent the operations we want a query to perform. + optional Term query = 2; // only present when [type] = [START] + optional int64 token = 3; + // This flag is ignored on the server. `noreply` should be added + // to `global_optargs` instead (the key "noreply" should map to + // either true or false). + optional bool OBSOLETE_noreply = 4 [default = false]; + + // If this is set to [true], then [Datum] values will sometimes be + // of [DatumType] [R_JSON] (see below). This can provide enormous + // speedups in languages with poor protobuf libraries. + optional bool accepts_r_json = 5 [default = false]; + + message AssocPair { + optional string key = 1; + optional Term val = 2; + } + repeated AssocPair global_optargs = 6; +} + +// A backtrace frame (see `backtrace` in Response below) +message Frame { + enum FrameType { + POS = 1; // Error occurred in a positional argument. + OPT = 2; // Error occurred in an optional argument. + } + optional FrameType type = 1; + optional int64 pos = 2; // The index of the positional argument. + optional string opt = 3; // The name of the optional argument. +} +message Backtrace { + repeated Frame frames = 1; +} + +// You get back a response with the same [token] as your query. +message Response { + enum ResponseType { + // These response types indicate success. + SUCCESS_ATOM = 1; // Query returned a single RQL datatype. + SUCCESS_SEQUENCE = 2; // Query returned a sequence of RQL datatypes. + SUCCESS_PARTIAL = 3; // Query returned a partial sequence of RQL + // datatypes. If you send a [CONTINUE] query with + // the same token as this response, you will get + // more of the sequence. Keep sending [CONTINUE] + // queries until you get back [SUCCESS_SEQUENCE]. + WAIT_COMPLETE = 4; // A [NOREPLY_WAIT] query completed. + SERVER_INFO = 5; // The data for a [SERVER_INFO] request. This is + // the same as `SUCCESS_ATOM` except that there will + // never be profiling data. + + // These response types indicate failure. + CLIENT_ERROR = 16; // Means the client is buggy. An example is if the + // client sends a malformed protobuf, or tries to + // send [CONTINUE] for an unknown token. + COMPILE_ERROR = 17; // Means the query failed during parsing or type + // checking. 
For example, if you pass too many + // arguments to a function. + RUNTIME_ERROR = 18; // Means the query failed at runtime. An example is + // if you add together two values from a table, but + // they turn out at runtime to be booleans rather + // than numbers. + } + optional ResponseType type = 1; + + // If `ResponseType` is `RUNTIME_ERROR`, this may be filled in with more + // information about the error. + enum ErrorType { + INTERNAL = 1000000; + RESOURCE_LIMIT = 2000000; + QUERY_LOGIC = 3000000; + NON_EXISTENCE = 3100000; + OP_FAILED = 4100000; + OP_INDETERMINATE = 4200000; + USER = 5000000; + PERMISSION_ERROR = 6000000; + } + optional ErrorType error_type = 7; + + // ResponseNotes are used to provide information about the query + // response that may be useful for people writing drivers or ORMs. + // Currently all the notes we send indicate that a stream has certain + // special properties. + enum ResponseNote { + // The stream is a changefeed stream (e.g. `r.table('test').changes()`). + SEQUENCE_FEED = 1; + // The stream is a point changefeed stream + // (e.g. `r.table('test').get(0).changes()`). + ATOM_FEED = 2; + // The stream is an order_by_limit changefeed stream + // (e.g. `r.table('test').order_by(index: 'id').limit(5).changes()`). + ORDER_BY_LIMIT_FEED = 3; + // The stream is a union of multiple changefeed types that can't be + // collapsed to a single type + // (e.g. `r.table('test').changes().union(r.table('test').get(0).changes())`). + UNIONED_FEED = 4; + // The stream is a changefeed stream and includes notes on what state + // the changefeed stream is in (e.g. objects of the form `{state: + // 'initializing'}`). + INCLUDES_STATES = 5; + } + repeated ResponseNote notes = 6; + + optional int64 token = 2; // Indicates what [Query] this response corresponds to. + + // [response] contains 1 RQL datum if [type] is [SUCCESS_ATOM] or + // [SERVER_INFO]. [response] contains many RQL data if [type] is + // [SUCCESS_SEQUENCE] or [SUCCESS_PARTIAL]. [response] contains 1 + // error message (of type [R_STR]) in all other cases. + repeated Datum response = 3; + + // If [type] is [CLIENT_ERROR], [TYPE_ERROR], or [RUNTIME_ERROR], then a + // backtrace will be provided. The backtrace says where in the query the + // error occurred. Ideally this information will be presented to the user as + // a pretty-printed version of their query with the erroneous section + // underlined. A backtrace is a series of 0 or more [Frame]s, each of which + // specifies either the index of a positional argument or the name of an + // optional argument. (Those words will make more sense if you look at the + // [Term] message below.) + optional Backtrace backtrace = 4; // Contains n [Frame]s when you get back an error. + + // If the [global_optargs] in the [Query] that this [Response] is a + // response to contains a key "profile" which maps to a static value of + // true then [profile] will contain a [Datum] which provides profiling + // information about the execution of the query. This field should be + // returned to the user along with the result that would normally be + // returned (a datum or a cursor). In official drivers this is accomplished + // by putting them inside of an object with "value" mapping to the return + // value and "profile" mapping to the profile object. + optional Datum profile = 5; +} + +// A [Datum] is a chunk of data that can be serialized to disk or returned to +// the user in a Response. 
Currently we only support JSON types, but we may +// support other types in the future (e.g., a date type or an integer type). +message Datum { + enum DatumType { + R_NULL = 1; + R_BOOL = 2; + R_NUM = 3; // a double + R_STR = 4; + R_ARRAY = 5; + R_OBJECT = 6; + // This [DatumType] will only be used if [accepts_r_json] is + // set to [true] in [Query]. [r_str] will be filled with a + // JSON encoding of the [Datum]. + R_JSON = 7; // uses r_str + } + optional DatumType type = 1; + optional bool r_bool = 2; + optional double r_num = 3; + optional string r_str = 4; + + repeated Datum r_array = 5; + message AssocPair { + optional string key = 1; + optional Datum val = 2; + } + repeated AssocPair r_object = 6; +} + +// A [Term] is either a piece of data (see **Datum** above), or an operator and +// its operands. If you have a [Datum], it's stored in the member [datum]. If +// you have an operator, its positional arguments are stored in [args] and its +// optional arguments are stored in [optargs]. +// +// A note about type signatures: +// We use the following notation to denote types: +// arg1_type, arg2_type, argrest_type... -> result_type +// So, for example, if we have a function `avg` that takes any number of +// arguments and averages them, we might write: +// NUMBER... -> NUMBER +// Or if we had a function that took one number modulo another: +// NUMBER, NUMBER -> NUMBER +// Or a function that takes a table and a primary key of any Datum type, then +// retrieves the entry with that primary key: +// Table, DATUM -> OBJECT +// Some arguments must be provided as literal values (and not the results of sub +// terms). These are marked with a `!`. +// Optional arguments are specified within curly braces as argname `:` value +// type (e.x `{noreply:BOOL}`) +// Many RQL operations are polymorphic. For these, alterantive type signatures +// are separated by `|`. +// +// The RQL type hierarchy is as follows: +// Top +// DATUM +// NULL +// BOOL +// NUMBER +// STRING +// OBJECT +// SingleSelection +// ARRAY +// Sequence +// ARRAY +// Stream +// StreamSelection +// Table +// Database +// Function +// Ordering - used only by ORDER_BY +// Pathspec -- an object, string, or array that specifies a path +// Error +message Term { + enum TermType { + // A RQL datum, stored in `datum` below. + DATUM = 1; + + MAKE_ARRAY = 2; // DATUM... -> ARRAY + // Evaluate the terms in [optargs] and make an object + MAKE_OBJ = 3; // {...} -> OBJECT + + // * Compound types + + // Takes an integer representing a variable and returns the value stored + // in that variable. It's the responsibility of the client to translate + // from their local representation of a variable to a unique _non-negative_ + // integer for that variable. (We do it this way instead of letting + // clients provide variable names as strings to discourage + // variable-capturing client libraries, and because it's more efficient + // on the wire.) + VAR = 10; // !NUMBER -> DATUM + // Takes some javascript code and executes it. + JAVASCRIPT = 11; // STRING {timeout: !NUMBER} -> DATUM | + // STRING {timeout: !NUMBER} -> Function(*) + UUID = 169; // () -> DATUM + + // Takes an HTTP URL and gets it. 
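+ // (On the JSON wire protocol used by the Dart driver in this repository,
+ // every operator in this enum, HTTP included, is serialized as
+ // [TermType, [args...], {optargs...}] with DATUM arguments inlined as
+ // plain JSON. For example, r.table('users').get(1) becomes
+ //   [16, [[15, ["users"]], 1]]
+ // and Query.serialize() in net.dart wraps it into a START envelope such as
+ //   [1, [16, [[15, ["users"]], 1]], {"db": [14, ["test"]]}]
+ // where the table and database names are made up for illustration.)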
If the get succeeds and + // returns valid JSON, it is converted into a DATUM + HTTP = 153; // STRING {data: OBJECT | STRING, + // timeout: !NUMBER, + // method: STRING, + // params: OBJECT, + // header: OBJECT | ARRAY, + // attempts: NUMBER, + // redirects: NUMBER, + // verify: BOOL, + // page: FUNC | STRING, + // page_limit: NUMBER, + // auth: OBJECT, + // result_format: STRING, + // } -> STRING | STREAM + + // Takes a string and throws an error with that message. + // Inside of a `default` block, you can omit the first + // argument to rethrow whatever error you catch (this is most + // useful as an argument to the `default` filter optarg). + ERROR = 12; // STRING -> Error | -> Error + // Takes nothing and returns a reference to the implicit variable. + IMPLICIT_VAR = 13; // -> DATUM + + // * Data Operators + // Returns a reference to a database. + DB = 14; // STRING -> Database + // Returns a reference to a table. + TABLE = 15; // Database, STRING, {read_mode:STRING, identifier_format:STRING} -> Table + // STRING, {read_mode:STRING, identifier_format:STRING} -> Table + // Gets a single element from a table by its primary or a secondary key. + GET = 16; // Table, STRING -> SingleSelection | Table, NUMBER -> SingleSelection | + // Table, STRING -> NULL | Table, NUMBER -> NULL | + GET_ALL = 78; // Table, DATUM..., {index:!STRING} => ARRAY + + // Simple DATUM Ops + EQ = 17; // DATUM... -> BOOL + NE = 18; // DATUM... -> BOOL + LT = 19; // DATUM... -> BOOL + LE = 20; // DATUM... -> BOOL + GT = 21; // DATUM... -> BOOL + GE = 22; // DATUM... -> BOOL + NOT = 23; // BOOL -> BOOL + // ADD can either add two numbers or concatenate two arrays. + ADD = 24; // NUMBER... -> NUMBER | STRING... -> STRING + SUB = 25; // NUMBER... -> NUMBER + MUL = 26; // NUMBER... -> NUMBER + DIV = 27; // NUMBER... -> NUMBER + MOD = 28; // NUMBER, NUMBER -> NUMBER + + FLOOR = 183; // NUMBER -> NUMBER + CEIL = 184; // NUMBER -> NUMBER + ROUND = 185; // NUMBER -> NUMBER + + // DATUM Array Ops + // Append a single element to the end of an array (like `snoc`). + APPEND = 29; // ARRAY, DATUM -> ARRAY + // Prepend a single element to the end of an array (like `cons`). + PREPEND = 80; // ARRAY, DATUM -> ARRAY + //Remove the elements of one array from another array. + DIFFERENCE = 95; // ARRAY, ARRAY -> ARRAY + + // DATUM Set Ops + // Set ops work on arrays. They don't use actual sets and thus have + // performance characteristics you would expect from arrays rather than + // from sets. All set operations have the post condition that they + // array they return contains no duplicate values. + SET_INSERT = 88; // ARRAY, DATUM -> ARRAY + SET_INTERSECTION = 89; // ARRAY, ARRAY -> ARRAY + SET_UNION = 90; // ARRAY, ARRAY -> ARRAY + SET_DIFFERENCE = 91; // ARRAY, ARRAY -> ARRAY + + SLICE = 30; // Sequence, NUMBER, NUMBER -> Sequence + SKIP = 70; // Sequence, NUMBER -> Sequence + LIMIT = 71; // Sequence, NUMBER -> Sequence + OFFSETS_OF = 87; // Sequence, DATUM -> Sequence | Sequence, Function(1) -> Sequence + CONTAINS = 93; // Sequence, (DATUM | Function(1))... -> BOOL + + // Stream/Object Ops + // Get a particular field from an object, or map that over a + // sequence. + GET_FIELD = 31; // OBJECT, STRING -> DATUM + // | Sequence, STRING -> Sequence + // Return an array containing the keys of the object. + KEYS = 94; // OBJECT -> ARRAY + // Return an array containing the values of the object. + VALUES = 186; // OBJECT -> ARRAY + // Creates an object + OBJECT = 143; // STRING, DATUM, ... 
-> OBJECT + // Check whether an object contains all the specified fields, + // or filters a sequence so that all objects inside of it + // contain all the specified fields. + HAS_FIELDS = 32; // OBJECT, Pathspec... -> BOOL + // x.with_fields(...) <=> x.has_fields(...).pluck(...) + WITH_FIELDS = 96; // Sequence, Pathspec... -> Sequence + // Get a subset of an object by selecting some attributes to preserve, + // or map that over a sequence. (Both pick and pluck, polymorphic.) + PLUCK = 33; // Sequence, Pathspec... -> Sequence | OBJECT, Pathspec... -> OBJECT + // Get a subset of an object by selecting some attributes to discard, or + // map that over a sequence. (Both unpick and without, polymorphic.) + WITHOUT = 34; // Sequence, Pathspec... -> Sequence | OBJECT, Pathspec... -> OBJECT + // Merge objects (right-preferential) + MERGE = 35; // OBJECT... -> OBJECT | Sequence -> Sequence + + // Sequence Ops + // Get all elements of a sequence between two values. + // Half-open by default, but the openness of either side can be + // changed by passing 'closed' or 'open for `right_bound` or + // `left_bound`. + BETWEEN_DEPRECATED = 36; // Deprecated version of between, which allows `null` to specify unboundedness + // With the newer version, clients should use `r.minval` and `r.maxval` for unboundedness + BETWEEN = 182; // StreamSelection, DATUM, DATUM, {index:!STRING, right_bound:STRING, left_bound:STRING} -> StreamSelection + REDUCE = 37; // Sequence, Function(2) -> DATUM + MAP = 38; // Sequence, Function(1) -> Sequence + // The arity of the function should be + // Sequence..., Function(sizeof...(Sequence)) -> Sequence + + FOLD = 187; // Sequence, Datum, Function(2), {Function(3), Function(1) + + // Filter a sequence with either a function or a shortcut + // object (see API docs for details). The body of FILTER is + // wrapped in an implicit `.default(false)`, and you can + // change the default value by specifying the `default` + // optarg. If you make the default `r.error`, all errors + // caught by `default` will be rethrown as if the `default` + // did not exist. + FILTER = 39; // Sequence, Function(1), {default:DATUM} -> Sequence | + // Sequence, OBJECT, {default:DATUM} -> Sequence + // Map a function over a sequence and then concatenate the results together. + CONCAT_MAP = 40; // Sequence, Function(1) -> Sequence + // Order a sequence based on one or more attributes. + ORDER_BY = 41; // Sequence, (!STRING | Ordering)..., {index: (!STRING | Ordering)} -> Sequence + // Get all distinct elements of a sequence (like `uniq`). + DISTINCT = 42; // Sequence -> Sequence + // Count the number of elements in a sequence, or only the elements that match + // a given filter. + COUNT = 43; // Sequence -> NUMBER | Sequence, DATUM -> NUMBER | Sequence, Function(1) -> NUMBER + IS_EMPTY = 86; // Sequence -> BOOL + // Take the union of multiple sequences (preserves duplicate elements! (use distinct)). + UNION = 44; // Sequence... -> Sequence + // Get the Nth element of a sequence. + NTH = 45; // Sequence, NUMBER -> DATUM + // do NTH or GET_FIELD depending on target object + BRACKET = 170; // Sequence | OBJECT, NUMBER | STRING -> DATUM + // OBSOLETE_GROUPED_MAPREDUCE = 46; + // OBSOLETE_GROUPBY = 47; + + INNER_JOIN = 48; // Sequence, Sequence, Function(2) -> Sequence + OUTER_JOIN = 49; // Sequence, Sequence, Function(2) -> Sequence + // An inner-join that does an equality comparison on two attributes. 
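+ // (For example, an eq_join of a posts sequence on 'author_id' against a
+ // users table pairs each post with the user whose primary key, or the
+ // secondary index named via the index optarg, equals that post's
+ // author_id. Each result row has the shape {left: <post>, right: <user>},
+ // which ZIP below flattens into a single merged object. The field and
+ // table names here are illustrative.)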
+ EQ_JOIN = 50; // Sequence, !STRING, Sequence, {index:!STRING} -> Sequence + ZIP = 72; // Sequence -> Sequence + RANGE = 173; // -> Sequence [0, +inf) + // NUMBER -> Sequence [0, a) + // NUMBER, NUMBER -> Sequence [a, b) + + // Array Ops + // Insert an element in to an array at a given index. + INSERT_AT = 82; // ARRAY, NUMBER, DATUM -> ARRAY + // Remove an element at a given index from an array. + DELETE_AT = 83; // ARRAY, NUMBER -> ARRAY | + // ARRAY, NUMBER, NUMBER -> ARRAY + // Change the element at a given index of an array. + CHANGE_AT = 84; // ARRAY, NUMBER, DATUM -> ARRAY + // Splice one array in to another array. + SPLICE_AT = 85; // ARRAY, NUMBER, ARRAY -> ARRAY + + // * Type Ops + // Coerces a datum to a named type (e.g. "bool"). + // If you previously used `stream_to_array`, you should use this instead + // with the type "array". + COERCE_TO = 51; // Top, STRING -> Top + // Returns the named type of a datum (e.g. TYPE_OF(true) = "BOOL") + TYPE_OF = 52; // Top -> STRING + + // * Write Ops (the OBJECTs contain data about number of errors etc.) + // Updates all the rows in a selection. Calls its Function with the row + // to be updated, and then merges the result of that call. + UPDATE = 53; // StreamSelection, Function(1), {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT | + // SingleSelection, Function(1), {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT | + // StreamSelection, OBJECT, {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT | + // SingleSelection, OBJECT, {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT + // Deletes all the rows in a selection. + DELETE = 54; // StreamSelection, {durability:STRING, return_changes:BOOL} -> OBJECT | SingleSelection -> OBJECT + // Replaces all the rows in a selection. Calls its Function with the row + // to be replaced, and then discards it and stores the result of that + // call. + REPLACE = 55; // StreamSelection, Function(1), {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT | SingleSelection, Function(1), {non_atomic:BOOL, durability:STRING, return_changes:BOOL} -> OBJECT + // Inserts into a table. If `conflict` is replace, overwrites + // entries with the same primary key. If `conflict` is + // update, does an update on the entry. If `conflict` is + // error, or is omitted, conflicts will trigger an error. + INSERT = 56; // Table, OBJECT, {conflict:STRING, durability:STRING, return_changes:BOOL} -> OBJECT | Table, Sequence, {conflict:STRING, durability:STRING, return_changes:BOOL} -> OBJECT + + // * Administrative OPs + // Creates a database with a particular name. + DB_CREATE = 57; // STRING -> OBJECT + // Drops a database with a particular name. + DB_DROP = 58; // STRING -> OBJECT + // Lists all the databases by name. (Takes no arguments) + DB_LIST = 59; // -> ARRAY + // Creates a table with a particular name in a particular + // database. (You may omit the first argument to use the + // default database.) + TABLE_CREATE = 60; // Database, STRING, {primary_key:STRING, shards:NUMBER, replicas:NUMBER, primary_replica_tag:STRING} -> OBJECT + // Database, STRING, {primary_key:STRING, shards:NUMBER, replicas:OBJECT, primary_replica_tag:STRING} -> OBJECT + // STRING, {primary_key:STRING, shards:NUMBER, replicas:NUMBER, primary_replica_tag:STRING} -> OBJECT + // STRING, {primary_key:STRING, shards:NUMBER, replicas:OBJECT, primary_replica_tag:STRING} -> OBJECT + // Drops a table with a particular name from a particular + // database. 
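+ // (As with TABLE_CREATE above, the Database argument is optional. In the
+ // Dart driver in this repository the fallback comes from Connection._start,
+ // which injects the connection's current database as the "db" global
+ // optarg whenever a query does not name one explicitly.)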
(You may omit the first argument to use the + // default database.) + TABLE_DROP = 61; // Database, STRING -> OBJECT + // STRING -> OBJECT + // Lists all the tables in a particular database. (You may + // omit the first argument to use the default database.) + TABLE_LIST = 62; // Database -> ARRAY + // -> ARRAY + // Returns the row in the `rethinkdb.table_config` or `rethinkdb.db_config` table + // that corresponds to the given database or table. + CONFIG = 174; // Database -> SingleSelection + // Table -> SingleSelection + // Returns the row in the `rethinkdb.table_status` table that corresponds to the + // given table. + STATUS = 175; // Table -> SingleSelection + // Called on a table, waits for that table to be ready for read/write operations. + // Called on a database, waits for all of the tables in the database to be ready. + // Returns the corresponding row or rows from the `rethinkdb.table_status` table. + WAIT = 177; // Table -> OBJECT + // Database -> OBJECT + // Generates a new config for the given table, or all tables in the given database + // The `shards` and `replicas` arguments are required. If `emergency_repair` is + // specified, it will enter a completely different mode of repairing a table + // which has lost half or more of its replicas. + RECONFIGURE = 176; // Database|Table, {shards:NUMBER, replicas:NUMBER [, + // dry_run:BOOLEAN] + // } -> OBJECT + // Database|Table, {shards:NUMBER, replicas:OBJECT [, + // primary_replica_tag:STRING, + // nonvoting_replica_tags:ARRAY, + // dry_run:BOOLEAN] + // } -> OBJECT + // Table, {emergency_repair:STRING, dry_run:BOOLEAN} -> OBJECT + // Balances the table's shards but leaves everything else the same. Can also be + // applied to an entire database at once. + REBALANCE = 179; // Table -> OBJECT + // Database -> OBJECT + + // Ensures that previously issued soft-durability writes are complete and + // written to disk. + SYNC = 138; // Table -> OBJECT + + // Set global, database, or table-specific permissions + GRANT = 188; // -> OBJECT + // Database -> OBJECT + // Table -> OBJECT + + // * Secondary indexes OPs + // Creates a new secondary index with a particular name and definition. + INDEX_CREATE = 75; // Table, STRING, Function(1), {multi:BOOL} -> OBJECT + // Drops a secondary index with a particular name from the specified table. + INDEX_DROP = 76; // Table, STRING -> OBJECT + // Lists all secondary indexes on a particular table. + INDEX_LIST = 77; // Table -> ARRAY + // Gets information about whether or not a set of indexes are ready to + // be accessed. Returns a list of objects that look like this: + // {index:STRING, ready:BOOL[, progress:NUMBER]} + INDEX_STATUS = 139; // Table, STRING... -> ARRAY + // Blocks until a set of indexes are ready to be accessed. Returns the + // same values INDEX_STATUS. + INDEX_WAIT = 140; // Table, STRING... -> ARRAY + // Renames the given index to a new name + INDEX_RENAME = 156; // Table, STRING, STRING, {overwrite:BOOL} -> OBJECT + + // * Write hook Function OPs + // Creates a new write hook function with a particular definition + SET_WRITE_HOOK = 189; // Table, Function(2) + // Gets an existing write hook function on a table + GET_WRITE_HOOK = 190; // Table + + + + // * Control Operators + // Calls a function on data + FUNCALL = 64; // Function(*), DATUM... -> DATUM + // Executes its first argument, and returns its second argument if it + // got [true] or its third argument if it got [false] (like an `if` + // statement). 
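+ // (For example, branch(expr(age).ge(18), "adult", "minor") evaluates the
+ // test once and yields one of the two datums; the spelling here is ReQL
+ // pseudo-code for illustration, not a specific method of this driver.)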
+ BRANCH = 65; // BOOL, Top, Top -> Top + // Returns true if any of its arguments returns true (short-circuits). + OR = 66; // BOOL... -> BOOL + // Returns true if all of its arguments return true (short-circuits). + AND = 67; // BOOL... -> BOOL + // Calls its Function with each entry in the sequence + // and executes the array of terms that Function returns. + FOR_EACH = 68; // Sequence, Function(1) -> OBJECT + +//////////////////////////////////////////////////////////////////////////////// +////////// Special Terms +//////////////////////////////////////////////////////////////////////////////// + + // An anonymous function. Takes an array of numbers representing + // variables (see [VAR] above), and a [Term] to execute with those in + // scope. Returns a function that may be passed an array of arguments, + // then executes the Term with those bound to the variable names. The + // user will never construct this directly. We use it internally for + // things like `map` which take a function. The "arity" of a [Function] is + // the number of arguments it takes. + // For example, here's what `_X_.map{|x| x+2}` turns into: + // Term { + // type = MAP; + // args = [_X_, + // Term { + // type = Function; + // args = [Term { + // type = DATUM; + // datum = Datum { + // type = R_ARRAY; + // r_array = [Datum { type = R_NUM; r_num = 1; }]; + // }; + // }, + // Term { + // type = ADD; + // args = [Term { + // type = VAR; + // args = [Term { + // type = DATUM; + // datum = Datum { type = R_NUM; + // r_num = 1}; + // }]; + // }, + // Term { + // type = DATUM; + // datum = Datum { type = R_NUM; r_num = 2; }; + // }]; + // }]; + // }]; + FUNC = 69; // ARRAY, Top -> ARRAY -> Top + + // Indicates to ORDER_BY that this attribute is to be sorted in ascending order. + ASC = 73; // !STRING -> Ordering + // Indicates to ORDER_BY that this attribute is to be sorted in descending order. + DESC = 74; // !STRING -> Ordering + + // Gets info about anything. INFO is most commonly called on tables. + INFO = 79; // Top -> OBJECT + + // `a.match(b)` returns a match object if the string `a` + // matches the regular expression `b`. + MATCH = 97; // STRING, STRING -> DATUM + + // Change the case of a string. + UPCASE = 141; // STRING -> STRING + DOWNCASE = 142; // STRING -> STRING + + // Select a number of elements from sequence with uniform distribution. + SAMPLE = 81; // Sequence, NUMBER -> Sequence + + // Evaluates its first argument. If that argument returns + // NULL or throws an error related to the absence of an + // expected value (for instance, accessing a non-existent + // field or adding NULL to an integer), DEFAULT will either + // return its second argument or execute it if it's a + // function. If the second argument is a function, it will be + // passed either the text of the error or NULL as its + // argument. + DEFAULT = 92; // Top, Top -> Top + + // Parses its first argument as a json string and returns it as a + // datum. + JSON = 98; // STRING -> DATUM + + // Parses its first arguments as an ISO 8601 time and returns it as a + // datum. + ISO8601 = 99; // STRING -> PSEUDOTYPE(TIME) + // Prints a time as an ISO 8601 time. + TO_ISO8601 = 100; // PSEUDOTYPE(TIME) -> STRING + + // Returns a time given seconds since epoch in UTC. + EPOCH_TIME = 101; // NUMBER -> PSEUDOTYPE(TIME) + // Returns seconds since epoch in UTC given a time. + TO_EPOCH_TIME = 102; // PSEUDOTYPE(TIME) -> NUMBER + + // The time the query was received by the server. 
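+ // (PSEUDOTYPE(TIME) values, including the one NOW returns, travel over the
+ // JSON protocol as objects of the form
+ //   {"$reql_type$": "TIME", "epoch_time": 1700000000, "timezone": "+00:00"}
+ // which the Dart driver unwraps via _recursivelyConvertPseudotypes before
+ // handing results back; the epoch value shown is only an illustration.)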
+ NOW = 103; // -> PSEUDOTYPE(TIME) + // Puts a time into an ISO 8601 timezone. + IN_TIMEZONE = 104; // PSEUDOTYPE(TIME), STRING -> PSEUDOTYPE(TIME) + // a.during(b, c) returns whether a is in the range [b, c) + DURING = 105; // PSEUDOTYPE(TIME), PSEUDOTYPE(TIME), PSEUDOTYPE(TIME) -> BOOL + // Retrieves the date portion of a time. + DATE = 106; // PSEUDOTYPE(TIME) -> PSEUDOTYPE(TIME) + // x.time_of_day == x.date - x + TIME_OF_DAY = 126; // PSEUDOTYPE(TIME) -> NUMBER + // Returns the timezone of a time. + TIMEZONE = 127; // PSEUDOTYPE(TIME) -> STRING + + // These access the various components of a time. + YEAR = 128; // PSEUDOTYPE(TIME) -> NUMBER + MONTH = 129; // PSEUDOTYPE(TIME) -> NUMBER + DAY = 130; // PSEUDOTYPE(TIME) -> NUMBER + DAY_OF_WEEK = 131; // PSEUDOTYPE(TIME) -> NUMBER + DAY_OF_YEAR = 132; // PSEUDOTYPE(TIME) -> NUMBER + HOURS = 133; // PSEUDOTYPE(TIME) -> NUMBER + MINUTES = 134; // PSEUDOTYPE(TIME) -> NUMBER + SECONDS = 135; // PSEUDOTYPE(TIME) -> NUMBER + + // Construct a time from a date and optional timezone or a + // date+time and optional timezone. + TIME = 136; // NUMBER, NUMBER, NUMBER, STRING -> PSEUDOTYPE(TIME) | + // NUMBER, NUMBER, NUMBER, NUMBER, NUMBER, NUMBER, STRING -> PSEUDOTYPE(TIME) | + + // Constants for ISO 8601 days of the week. + MONDAY = 107; // -> 1 + TUESDAY = 108; // -> 2 + WEDNESDAY = 109; // -> 3 + THURSDAY = 110; // -> 4 + FRIDAY = 111; // -> 5 + SATURDAY = 112; // -> 6 + SUNDAY = 113; // -> 7 + + // Constants for ISO 8601 months. + JANUARY = 114; // -> 1 + FEBRUARY = 115; // -> 2 + MARCH = 116; // -> 3 + APRIL = 117; // -> 4 + MAY = 118; // -> 5 + JUNE = 119; // -> 6 + JULY = 120; // -> 7 + AUGUST = 121; // -> 8 + SEPTEMBER = 122; // -> 9 + OCTOBER = 123; // -> 10 + NOVEMBER = 124; // -> 11 + DECEMBER = 125; // -> 12 + + // Indicates to MERGE to replace, or remove in case of an empty literal, the + // other object rather than merge it. + LITERAL = 137; // -> Merging + // JSON -> Merging + + // SEQUENCE, STRING -> GROUPED_SEQUENCE | SEQUENCE, FUNCTION -> GROUPED_SEQUENCE + GROUP = 144; + SUM = 145; + AVG = 146; + MIN = 147; + MAX = 148; + + // `str.split()` splits on whitespace + // `str.split(" ")` splits on spaces only + // `str.split(" ", 5)` splits on spaces with at most 5 results + // `str.split(nil, 5)` splits on whitespace with at most 5 results + SPLIT = 149; // STRING -> ARRAY | STRING, STRING -> ARRAY | STRING, STRING, NUMBER -> ARRAY | STRING, NULL, NUMBER -> ARRAY + + UNGROUP = 150; // GROUPED_DATA -> ARRAY + + // Takes a range of numbers and returns a random number within the range + RANDOM = 151; // NUMBER, NUMBER {float:BOOL} -> DATUM + + CHANGES = 152; // TABLE -> STREAM + ARGS = 154; // ARRAY -> SPECIAL (used to splice arguments) + + // BINARY is client-only at the moment, it is not supported on the server + BINARY = 155; // STRING -> PSEUDOTYPE(BINARY) + + GEOJSON = 157; // OBJECT -> PSEUDOTYPE(GEOMETRY) + TO_GEOJSON = 158; // PSEUDOTYPE(GEOMETRY) -> OBJECT + POINT = 159; // NUMBER, NUMBER -> PSEUDOTYPE(GEOMETRY) + LINE = 160; // (ARRAY | PSEUDOTYPE(GEOMETRY))... -> PSEUDOTYPE(GEOMETRY) + POLYGON = 161; // (ARRAY | PSEUDOTYPE(GEOMETRY))... 
-> PSEUDOTYPE(GEOMETRY) + DISTANCE = 162; // PSEUDOTYPE(GEOMETRY), PSEUDOTYPE(GEOMETRY) {geo_system:STRING, unit:STRING} -> NUMBER + INTERSECTS = 163; // PSEUDOTYPE(GEOMETRY), PSEUDOTYPE(GEOMETRY) -> BOOL + INCLUDES = 164; // PSEUDOTYPE(GEOMETRY), PSEUDOTYPE(GEOMETRY) -> BOOL + CIRCLE = 165; // PSEUDOTYPE(GEOMETRY), NUMBER {num_vertices:NUMBER, geo_system:STRING, unit:STRING, fill:BOOL} -> PSEUDOTYPE(GEOMETRY) + GET_INTERSECTING = 166; // TABLE, PSEUDOTYPE(GEOMETRY) {index:!STRING} -> StreamSelection + FILL = 167; // PSEUDOTYPE(GEOMETRY) -> PSEUDOTYPE(GEOMETRY) + GET_NEAREST = 168; // TABLE, PSEUDOTYPE(GEOMETRY) {index:!STRING, max_results:NUM, max_dist:NUM, geo_system:STRING, unit:STRING} -> ARRAY + POLYGON_SUB = 171; // PSEUDOTYPE(GEOMETRY), PSEUDOTYPE(GEOMETRY) -> PSEUDOTYPE(GEOMETRY) + + // Returns the datum as a JSON string. + // N.B.: we would really prefer this be named TO_JSON and that exists as + // an alias in Python and JavaScript drivers; however it conflicts with the + // standard `to_json` method defined by Ruby's standard json library. + TO_JSON_STRING = 172; // DATUM -> STRING + + // Constants for specifying key ranges + MINVAL = 180; + MAXVAL = 181; + + // Bitwise operations + BIT_AND = 191; + BIT_OR = 192; + BIT_XOR = 193; + BIT_NOT = 194; + BIT_SAL = 195; + BIT_SAR = 196; + } + optional TermType type = 1; + + // This is only used when type is DATUM. + optional Datum datum = 2; + + repeated Term args = 3; // Holds the positional arguments of the query. + message AssocPair { + optional string key = 1; + optional Term val = 2; + } + repeated AssocPair optargs = 4; // Holds the optional arguments of the query. + // (Note that the order of the optional arguments doesn't matter; think of a + // Hash.) +} + +//////////////////////////////////////////////////////////////////////////////// +// EXAMPLE // +//////////////////////////////////////////////////////////////////////////////// +// ```ruby +// r.table('tbl', {:read_mode => 'outdated'}).insert([{:id => 0}, {:id => 1}]) +// ``` +// Would turn into: +// Term { +// type = INSERT; +// args = [Term { +// type = TABLE; +// args = [Term { +// type = DATUM; +// datum = Datum { type = R_STR; r_str = "tbl"; }; +// }]; +// optargs = [["read_mode", +// Term { +// type = DATUM; +// datum = Datum { type = R_STR; r_bool = "outdated"; }; +// }]]; +// }, +// Term { +// type = MAKE_ARRAY; +// args = [Term { +// type = DATUM; +// datum = Datum { type = R_OBJECT; r_object = [["id", 0]]; }; +// }, +// Term { +// type = DATUM; +// datum = Datum { type = R_OBJECT; r_object = [["id", 1]]; }; +// }]; +// }] +// } +// And the server would reply: +// Response { +// type = SUCCESS_ATOM; +// token = 1; +// response = [Datum { type = R_OBJECT; r_object = [["inserted", 2]]; }]; +// } +// Or, if there were an error: +// Response { +// type = RUNTIME_ERROR; +// token = 1; +// response = [Datum { type = R_STR; r_str = "The table `tbl` doesn't exist!"; }]; +// backtrace = [Frame { type = POS; pos = 0; }, Frame { type = POS; pos = 0; }]; +// } diff --git a/drivers/rethinkdb/lib/src/generated/regenerate-proto.sh b/drivers/rethinkdb/lib/src/generated/regenerate-proto.sh new file mode 100755 index 0000000..b1fdc87 --- /dev/null +++ b/drivers/rethinkdb/lib/src/generated/regenerate-proto.sh @@ -0,0 +1,33 @@ +#!/bin/sh + +DIR="$(dirname $BASH_SOURCE)" + +# Do a quick validation that we have the protoc plugin in PATH. +PROTOC_PLUGIN=$(which protoc-gen-dart) +if [ ! -f "$PROTOC_PLUGIN" ]; then + echo -en "Could not find Dart plugin for protoc! 
\nMake sure \$PATH includes " + echo "the protoc compiler plugin for Dart (named \"protoc-gen-dart\")!" + exit 1 +fi + +function run { + echo "Running $@" + $@ + + EXITCODE=$? + if [ $EXITCODE -ne 0 ]; then + echo " -> Command failed with exitcode $EXITCODE. Aborting ..." + exit $EXITCODE + fi +} + + +# Retrieve updated protobuf for rethinkdb's wire protocol. +# See https://www.rethinkdb.com/docs/writing-drivers/ for more detail. +run wget -O $DIR/ql2.proto https://raw.githubusercontent.com/rethinkdb/rethinkdb/next/src/rdb_protocol/ql2.proto + +# Re-generate protobuf files. +# See https://developers.google.com/protocol-buffers/docs/reference/dart-generated +run protoc --proto_path=$DIR --dart_out=$DIR $DIR/ql2.proto + +run dart format --fix $DIR \ No newline at end of file diff --git a/drivers/rethinkdb/lib/src/net.dart b/drivers/rethinkdb/lib/src/net.dart new file mode 100644 index 0000000..7abb010 --- /dev/null +++ b/drivers/rethinkdb/lib/src/net.dart @@ -0,0 +1,534 @@ +part of '../platform_driver_rethinkdb.dart'; + +class Query extends RqlQuery { + final p.Query_QueryType _type; + final int _token; + final RqlQuery? _term; + late final Map? _globalOptargs; + Cursor? _cursor; + final Completer _queryCompleter = Completer(); + + Query(this._type, this._token, [this._term, this._globalOptargs]); + + serialize() { + List res = [_type.value]; + if (_term != null) { + res.add(_term.build()); + } + if (_globalOptargs != null) { + Map optargs = {}; + _globalOptargs.forEach((k, v) { + optargs[k] = v is RqlQuery ? v.build() : v; + }); + + res.add(optargs); + } + return json.encode(res); + } +} + +class Response { + final int _token; + late int _type; + dynamic _data; + dynamic _backtrace; + dynamic _profile; + late int? _errorType; + List? _notes = []; + + Response(this._token, String jsonStr) { + if (jsonStr.isNotEmpty) { + Map fullResponse = json.decode(jsonStr); + _type = fullResponse['t']; + _data = fullResponse['r']; + _backtrace = fullResponse['b']; + _profile = fullResponse['p']; + _notes = fullResponse['n']; + _errorType = fullResponse['e']; + } + } +} + +class Connection { + Socket? _socket; + static int _nextToken = 0; + final String _host; + final int _port; + String _db; + final String _user; + final String _password; + final int _protocolVersion = 0; + late String _clientFirstMessage; + late Digest _serverSignature; + late final Map? _sslOpts; + + final Completer _completer = Completer(); + + int _responseLength = 0; + final List _responseBuffer = []; + + final Map _replyQueries = {}; + final Queue _sendQueue = Queue(); + + final Map _listeners = {}; + + Connection( + this._db, + this._host, + this._port, + this._user, + this._password, + this._sslOpts, + ); + + // ignore: unnecessary_null_comparison + get isClosed => _socket == null; + + void use(String db) { + _db = db; + } + + Future server() { + // RqlQuery query = + RqlQuery query = + Query(p.Query_QueryType.SERVER_INFO, _getToken(), null, null); + _sendQueue.add(query); + return _start(query); + } + + Future connect([bool noreplyWait = true]) { + return (reconnect(noreplyWait)); + } + + Future reconnect([bool noreplyWait = true]) { + close(noreplyWait); + + if (_listeners["connect"] != null) { + for (var func in _listeners["connect"]!) 
{ + func(); + } + } + var sock = Socket.connect(_host, _port); + + if (_sslOpts != null && _sslOpts.containsKey('ca')) { + SecurityContext context = SecurityContext() + ..setTrustedCertificates(_sslOpts['ca']); + sock = SecureSocket.connect(_host, _port, context: context); + } + + sock.then((socket) { + // ignore: unnecessary_null_comparison + if (socket != null) { + _socket = socket; + _socket!.listen(_handleResponse, onDone: () { + if (_listeners["close"] != null) { + for (var func in _listeners["close"]!) { + func(); + } + } + }); + + _clientFirstMessage = "n=$_user,r=${_makeSalt()}"; + String message = json.encode({ + 'protocol_version': _protocolVersion, + 'authentication_method': "SCRAM-SHA-256", + // ignore: unnecessary_brace_in_string_interps + 'authentication': "n,,${_clientFirstMessage}" + }); + List handshake = + List.from(_toBytes(p.VersionDummy_Version.V1_0.value)) + ..addAll(message.codeUnits) + ..add(0); + + _socket!.add(handshake); + } + }).catchError((err) { + _completer.completeError( + RqlDriverError("Could not connect to $_host:$_port. Error $err"), + ); + }); + + return _completer.future; + } + + _handleResponse(List bytes) { + if (!_completer.isCompleted) { + _handleAuthResponse(bytes); + } else { + _readResponse(bytes); + } + } + + _handleAuthResponse(List res) { + List response = []; + for (final byte in res) { + if (byte == 0) { + _doHandshake(response); + response.clear(); + } else { + response.add(byte); + } + } + } + + _handleAuthError(Exception error) { + if (_listeners["error"] != null) { + for (var func in _listeners["error"]!) { + func(error); + } + } + _completer.completeError(error); + } + + _doHandshake(List response) { + Map responseJSON = json.decode(utf8.decode(response)); + + if (responseJSON.containsKey('success') && responseJSON['success']) { + if (responseJSON.containsKey('max_protocol_version')) { + int max = responseJSON['max_protocol_version']; + int min = responseJSON['min_protocol_version']; + if (min > _protocolVersion || max < _protocolVersion) { + //We don't actually support changing the protocol yet, so just error. 
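+ // For reference, the V1_0 handshake that reconnect() started goes:
+ //   client -> the 4-byte little-endian magic 0x34c2bdc3, then a
+ //             NUL-terminated JSON document carrying protocol_version 0
+ //             and the SCRAM-SHA-256 client-first-message;
+ //   server -> {"success": true, "min_protocol_version": ...,
+ //              "max_protocol_version": ...}, followed by the SCRAM
+ //             server-first and server-final documents handled further down.
+ // Only protocol_version 0 is implemented here, so a server whose
+ // advertised range excludes 0 is a hard error: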
+ _handleAuthError( + RqlDriverError("""Unsupported protocol version $_protocolVersion, + expected between $min and $max.""")); + } + } else if (responseJSON.containsKey('authentication')) { + String authString = responseJSON['authentication']; + Map authMap = {}; + List authPieces = authString.split(','); + + for (var piece in authPieces) { + int i = piece.indexOf('='); + String key = piece.substring(0, i); + String val = piece.substring(i + 1); + authMap[key] = val; + } + + if (authMap.containsKey('r')) { + String salt = String.fromCharCodes(base64.decode(authMap['s'])); + + int i = int.parse(authMap['i']); + String clientFinalMessageWithoutProof = "c=biws,r=${authMap['r']}"; + + //PBKDF2NS gen = PBKDF2NS(hash: sha256); + //List saltedPassword = gen.generateKey(_password, salt, i, 32); + var saltedPassword = + hashlib.pbkdf2(_password.codeUnits, salt.codeUnits, i, 32); + + Digest clientKey = Hmac(sha256, saltedPassword.bytes) + .convert("Client Key".codeUnits); + Digest storedKey = sha256.convert(clientKey.bytes); + + String authMessage = + "$_clientFirstMessage,$authString,$clientFinalMessageWithoutProof"; + + Digest clientSignature = + Hmac(sha256, storedKey.bytes).convert(authMessage.codeUnits); + + List clientProof = _xOr(clientKey.bytes, clientSignature.bytes); + + Digest serverKey = Hmac(sha256, saltedPassword.bytes) + .convert("Server Key".codeUnits); + + _serverSignature = + Hmac(sha256, serverKey.bytes).convert(authMessage.codeUnits); + + String message = json.encode({ + 'authentication': + "$clientFinalMessageWithoutProof,p=${base64.encode(clientProof)}" + }); + + List messageBytes = List.from(message.codeUnits)..add(0); + + _socket!.add(messageBytes); + } else if (authMap.containsKey('v')) { + if (base64.encode(_serverSignature.bytes) != authMap['v']) { + _handleAuthError(RqlDriverError("Invalid server signature")); + } else { + _completer.complete(this); + } + } + } + } else { + _handleAuthError(RqlDriverError( + "Server dropped connection with message: ${responseJSON['error']}")); + } + } + + _handleQueryResponse(Response response) { + Query query = _replyQueries.remove(response._token); + + Exception? hasError = _checkErrorResponse(response, query._term); + // ignore: unnecessary_null_comparison + if (hasError != null) { + query._queryCompleter.completeError(hasError); + } + dynamic value; + + if (response._type == p.Response_ResponseType.SUCCESS_PARTIAL.value) { + _replyQueries[response._token] = query; + dynamic cursor; + for (var note in response._notes!) { + if (note == p.Response_ResponseNote.SEQUENCE_FEED.value) { + cursor = cursor ?? Feed(this, query, query.optargs); + } else if (note == p.Response_ResponseNote.UNIONED_FEED.value) { + cursor = cursor ?? UnionedFeed(this, query, query.optargs); + } else if (note == p.Response_ResponseNote.ATOM_FEED.value) { + cursor = cursor ?? AtomFeed(this, query, query.optargs); + } else if (note == p.Response_ResponseNote.ORDER_BY_LIMIT_FEED.value) { + cursor = cursor ?? OrderByLimitFeed(this, query, query.optargs); + } + } + cursor = cursor ?? 
Cursor(this, query, query.optargs); + + value = cursor; + query._cursor = value; + value._extend(response); + } else if (response._type == + p.Response_ResponseType.SUCCESS_SEQUENCE.value) { + value = Cursor(this, query, {}); + query._cursor = value; + value._extend(response); + } else if (response._type == p.Response_ResponseType.SUCCESS_ATOM.value) { + if (response._data.length < 1) { + value = null; + } + value = query._recursivelyConvertPseudotypes(response._data.first, null); + } else if (response._type == p.Response_ResponseType.WAIT_COMPLETE.value) { + //Noreply_wait response + value = null; + } else if (response._type == p.Response_ResponseType.SERVER_INFO.value) { + query._queryCompleter.complete(response._data.first); + } else { + if (!query._queryCompleter.isCompleted) { + query._queryCompleter + .completeError(RqlDriverError("Error: ${response._data}.")); + } + } + + if (response._profile != null) { + value = {"value": value, "profile": response._profile}; + } + if (!query._queryCompleter.isCompleted) { + query._queryCompleter.complete(value); + } + } + + void close([bool noreplyWait = true]) { + // ignore: unnecessary_null_comparison + if (_socket != null) { + if (noreplyWait) this.noreplyWait(); + try { + _socket!.close(); + } catch (err) { + // TODO: do something with err. + } + + _socket!.destroy(); + _socket = null; + } + } + + /// Alias for addListener + void on(String key, Function val) { + addListener(key, val); + } + + /// Adds a listener to the connection. + void addListener(String key, Function val) { + List currentListeners = []; + // ignore: unnecessary_null_comparison + if (_listeners != null && _listeners[key] != null) { + for (var element in _listeners[key]!) { + currentListeners.add(element); + } + } + + currentListeners.add(val); + _listeners[key] = currentListeners; + } + + _getToken() { + return ++_nextToken; + } + + clientPort() { + return _socket!.port; + } + + clientAddress() { + return _socket!.address.address; + } + + noreplyWait() { + // RqlQuery query = + RqlQuery query = + Query(p.Query_QueryType.NOREPLY_WAIT, _getToken(), null, null); + + _sendQueue.add(query); + return _start(query); + } + + _handleCursorResponse(Response response) { + Cursor cursor = _replyQueries[response._token]._cursor; + cursor._extend(response); + cursor._outstandingRequests--; + + if (response._type != p.Response_ResponseType.SUCCESS_PARTIAL.value && + cursor._outstandingRequests == 0) { + _replyQueries[response._token]._cursor = null; + } + } + + _readResponse(res) { + int responseToken; + String responseBuf; + int responseLen; + + _responseBuffer.addAll(List.from(res)); + + _responseLength = _responseBuffer.length; + + if (_responseLength >= 12) { + responseToken = _fromBytes(_responseBuffer.sublist(0, 8)); + responseLen = _fromBytes(_responseBuffer.sublist(8, 12)); + if (_responseLength >= responseLen + 12) { + responseBuf = + utf8.decode(_responseBuffer.sublist(12, responseLen + 12)); + + _responseBuffer.removeRange(0, responseLen + 12); + _responseLength = _responseBuffer.length; + + Response response = Response(responseToken, responseBuf); + + if (_replyQueries[response._token]._cursor != null) { + _handleCursorResponse(response); + } + //if for some reason there are other queries on the line... 
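+ // Each incoming frame has the same layout _sendQuery writes out: an
+ // 8-byte little-endian query token, a 4-byte little-endian byte length,
+ // then that many bytes of JSON. For token 1 a frame looks roughly like
+ //   01 00 00 00 00 00 00 00 | 16 00 00 00 | {"t":1,"r":[1],"n":[]}
+ // (pipes added for readability; the payload is illustrative). Dispatch is
+ // purely by token, so a token this connection never issued means the
+ // stream is out of sync and we fail loudly below.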
+ + if (_replyQueries.containsKey(response._token)) { + _handleQueryResponse(response); + } else { + throw RqlDriverError("Unexpected response received."); + } + + if (_responseLength > 0) { + _readResponse([]); + } + } + } + } + + _checkErrorResponse(Response response, RqlQuery? term) { + dynamic message; + dynamic frames; + if (response._type == p.Response_ResponseType.RUNTIME_ERROR.value) { + message = response._data.first; + frames = response._backtrace; + int? errType = response._errorType; + if (errType == p.Response_ErrorType.INTERNAL.value) { + return ReqlInternalError(message, term, frames); + } else if (errType == p.Response_ErrorType.RESOURCE_LIMIT.value) { + return ReqlResourceLimitError(message, term, frames); + } else if (errType == p.Response_ErrorType.QUERY_LOGIC.value) { + return ReqlQueryLogicError(message, term, frames); + } else if (errType == p.Response_ErrorType.NON_EXISTENCE.value) { + return ReqlNonExistenceError(message, term, frames); + } else if (errType == p.Response_ErrorType.OP_FAILED.value) { + return ReqlOpFailedError(message, term, frames); + } else if (errType == p.Response_ErrorType.OP_INDETERMINATE.value) { + return ReqlOpIndeterminateError(message, term, frames); + } else if (errType == p.Response_ErrorType.USER.value) { + return ReqlUserError(message, term, frames); + } else if (errType == p.Response_ErrorType.PERMISSION_ERROR.value) { + return ReqlPermissionError(message, term, frames); + } else { + return RqlRuntimeError(message, term, frames); + } + } else if (response._type == p.Response_ResponseType.COMPILE_ERROR.value) { + message = response._data.first; + frames = response._backtrace; + return RqlCompileError(message, term, frames); + } else if (response._type == p.Response_ResponseType.CLIENT_ERROR.value) { + message = response._data.first; + frames = response._backtrace; + return RqlClientError(message, term, frames); + } + return null; + } + + _sendQuery() { + if (_sendQueue.isNotEmpty) { + Query query = _sendQueue.removeFirst(); + + // Error if this connection has closed + if (_socket == null) { + query._queryCompleter + .completeError(RqlDriverError("Connection is closed.")); + } else { + // Send json + List queryStr = utf8.encode(query.serialize()); + List queryHeader = List.from(_toBytes8(query._token)) + ..addAll(_toBytes(queryStr.length)) + ..addAll(queryStr); + _socket!.add(queryHeader); + + _replyQueries[query._token] = query; + return query._queryCompleter.future; + } + } + } + + _start(RqlQuery term, [Map? 
globalOptargs]) { + globalOptargs ??= {}; + if (globalOptargs.containsKey('db')) { + globalOptargs['db'] = DB(globalOptargs['db']); + } else { + globalOptargs['db'] = DB(_db); + } + + Query query = + Query(p.Query_QueryType.START, _getToken(), term, globalOptargs); + _sendQueue.addLast(query); + return _sendQuery(); + } + + Uint8List _toBytes(int data) { + ByteBuffer buffer = Uint8List(4).buffer; + ByteData bdata = ByteData.view(buffer); + bdata.setInt32(0, data, Endian.little); + return Uint8List.view(buffer); + } + + Uint8List _toBytes8(int data) { + ByteBuffer buffer = Uint8List(8).buffer; + ByteData bdata = ByteData.view(buffer); + bdata.setInt32(0, data, Endian.little); + return Uint8List.view(buffer); + } + + int _fromBytes(List data) { + Uint8List buf = Uint8List.fromList(data); + ByteData bdata = ByteData.view(buf.buffer); + return bdata.getInt32(0, Endian.little); + } + + String _makeSalt() { + List randomBytes = []; + math.Random random = math.Random.secure(); + + for (int i = 0; i < randomBytes.length; ++i) { + randomBytes[i] = random.nextInt(255); + } + + return base64.encode(randomBytes); + } + + List _xOr(List result, List next) { + for (int i = 0; i < result.length; i++) { + result[i] ^= next[i]; + } + return result; + } +} diff --git a/drivers/rethinkdb/pubspec.lock b/drivers/rethinkdb/pubspec.lock new file mode 100644 index 0000000..e0f8d5d --- /dev/null +++ b/drivers/rethinkdb/pubspec.lock @@ -0,0 +1,434 @@ +# Generated by pub +# See https://dart.dev/tools/pub/glossary#lockfile +packages: + _fe_analyzer_shared: + dependency: transitive + description: + name: _fe_analyzer_shared + sha256: "45cfa8471b89fb6643fe9bf51bd7931a76b8f5ec2d65de4fb176dba8d4f22c77" + url: "https://pub.dev" + source: hosted + version: "73.0.0" + _macros: + dependency: transitive + description: dart + source: sdk + version: "0.3.2" + analyzer: + dependency: transitive + description: + name: analyzer + sha256: "4959fec185fe70cce007c57e9ab6983101dbe593d2bf8bbfb4453aaec0cf470a" + url: "https://pub.dev" + source: hosted + version: "6.8.0" + args: + dependency: transitive + description: + name: args + sha256: bf9f5caeea8d8fe6721a9c358dd8a5c1947b27f1cfaa18b39c301273594919e6 + url: "https://pub.dev" + source: hosted + version: "2.6.0" + async: + dependency: transitive + description: + name: async + sha256: d2872f9c19731c2e5f10444b14686eb7cc85c76274bd6c16e1816bff9a3bab63 + url: "https://pub.dev" + source: hosted + version: "2.12.0" + boolean_selector: + dependency: transitive + description: + name: boolean_selector + sha256: "8aab1771e1243a5063b8b0ff68042d67334e3feab9e95b9490f9a6ebf73b42ea" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + collection: + dependency: transitive + description: + name: collection + sha256: "2f5709ae4d3d59dd8f7cd309b4e023046b57d8a6c82130785d2b0e5868084e76" + url: "https://pub.dev" + source: hosted + version: "1.19.1" + convert: + dependency: "direct main" + description: + name: convert + sha256: b30acd5944035672bc15c6b7a8b47d773e41e2f17de064350988c5d02adb1c68 + url: "https://pub.dev" + source: hosted + version: "3.1.2" + coverage: + dependency: transitive + description: + name: coverage + sha256: "88b0fddbe4c92910fefc09cc0248f5e7f0cd23e450ded4c28f16ab8ee8f83268" + url: "https://pub.dev" + source: hosted + version: "1.10.0" + crypto: + dependency: "direct main" + description: + name: crypto + sha256: "1e445881f28f22d6140f181e07737b22f1e099a5e1ff94b0af2f9e4a463f4855" + url: "https://pub.dev" + source: hosted + version: "3.0.6" + file: + dependency: transitive + 
description: + name: file + sha256: a3b4f84adafef897088c160faf7dfffb7696046cb13ae90b508c2cbc95d3b8d4 + url: "https://pub.dev" + source: hosted + version: "7.0.1" + fixnum: + dependency: "direct main" + description: + name: fixnum + sha256: b6dc7065e46c974bc7c5f143080a6764ec7a4be6da1285ececdc37be96de53be + url: "https://pub.dev" + source: hosted + version: "1.1.1" + frontend_server_client: + dependency: transitive + description: + name: frontend_server_client + sha256: f64a0333a82f30b0cca061bc3d143813a486dc086b574bfb233b7c1372427694 + url: "https://pub.dev" + source: hosted + version: "4.0.0" + glob: + dependency: transitive + description: + name: glob + sha256: "0e7014b3b7d4dac1ca4d6114f82bf1782ee86745b9b42a92c9289c23d8a0ab63" + url: "https://pub.dev" + source: hosted + version: "2.1.2" + hashlib: + dependency: "direct main" + description: + name: hashlib + sha256: f572f2abce09fc7aee53f15927052b9732ea1053e540af8cae211111ee0b99b1 + url: "https://pub.dev" + source: hosted + version: "1.21.0" + hashlib_codecs: + dependency: transitive + description: + name: hashlib_codecs + sha256: "8cea9ccafcfeaa7324d2ae52c61c69f7ff71f4237507a018caab31b9e416e3b1" + url: "https://pub.dev" + source: hosted + version: "2.6.0" + http_multi_server: + dependency: transitive + description: + name: http_multi_server + sha256: "97486f20f9c2f7be8f514851703d0119c3596d14ea63227af6f7a481ef2b2f8b" + url: "https://pub.dev" + source: hosted + version: "3.2.1" + http_parser: + dependency: transitive + description: + name: http_parser + sha256: "76d306a1c3afb33fe82e2bbacad62a61f409b5634c915fceb0d799de1a913360" + url: "https://pub.dev" + source: hosted + version: "4.1.1" + io: + dependency: transitive + description: + name: io + sha256: "2ec25704aba361659e10e3e5f5d672068d332fc8ac516421d483a11e5cbd061e" + url: "https://pub.dev" + source: hosted + version: "1.0.4" + js: + dependency: transitive + description: + name: js + sha256: c1b2e9b5ea78c45e1a0788d29606ba27dc5f71f019f32ca5140f61ef071838cf + url: "https://pub.dev" + source: hosted + version: "0.7.1" + lints: + dependency: "direct dev" + description: + name: lints + sha256: "3315600f3fb3b135be672bf4a178c55f274bebe368325ae18462c89ac1e3b413" + url: "https://pub.dev" + source: hosted + version: "5.0.0" + logging: + dependency: transitive + description: + name: logging + sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61 + url: "https://pub.dev" + source: hosted + version: "1.3.0" + macros: + dependency: transitive + description: + name: macros + sha256: "0acaed5d6b7eab89f63350bccd82119e6c602df0f391260d0e32b5e23db79536" + url: "https://pub.dev" + source: hosted + version: "0.1.2-main.4" + matcher: + dependency: transitive + description: + name: matcher + sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb + url: "https://pub.dev" + source: hosted + version: "0.12.16+1" + meta: + dependency: transitive + description: + name: meta + sha256: e3641ec5d63ebf0d9b41bd43201a66e3fc79a65db5f61fc181f04cd27aab950c + url: "https://pub.dev" + source: hosted + version: "1.16.0" + mime: + dependency: transitive + description: + name: mime + sha256: "41a20518f0cb1256669420fdba0cd90d21561e560ac240f26ef8322e45bb7ed6" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + node_preamble: + dependency: transitive + description: + name: node_preamble + sha256: "6e7eac89047ab8a8d26cf16127b5ed26de65209847630400f9aefd7cd5c730db" + url: "https://pub.dev" + source: hosted + version: "2.0.2" + package_config: + dependency: transitive + description: + name: 
package_config + sha256: "1c5b77ccc91e4823a5af61ee74e6b972db1ef98c2ff5a18d3161c982a55448bd" + url: "https://pub.dev" + source: hosted + version: "2.1.0" + path: + dependency: transitive + description: + name: path + sha256: "75cca69d1490965be98c73ceaea117e8a04dd21217b37b292c9ddbec0d955bc5" + url: "https://pub.dev" + source: hosted + version: "1.9.1" + pool: + dependency: transitive + description: + name: pool + sha256: "20fe868b6314b322ea036ba325e6fc0711a22948856475e2c2b6306e8ab39c2a" + url: "https://pub.dev" + source: hosted + version: "1.5.1" + protobuf: + dependency: "direct main" + description: + name: protobuf + sha256: "68645b24e0716782e58948f8467fd42a880f255096a821f9e7d0ec625b00c84d" + url: "https://pub.dev" + source: hosted + version: "3.1.0" + pub_semver: + dependency: transitive + description: + name: pub_semver + sha256: "40d3ab1bbd474c4c2328c91e3a7df8c6dd629b79ece4c4bd04bee496a224fb0c" + url: "https://pub.dev" + source: hosted + version: "2.1.4" + shelf: + dependency: transitive + description: + name: shelf + sha256: e7dd780a7ffb623c57850b33f43309312fc863fb6aa3d276a754bb299839ef12 + url: "https://pub.dev" + source: hosted + version: "1.4.2" + shelf_packages_handler: + dependency: transitive + description: + name: shelf_packages_handler + sha256: "89f967eca29607c933ba9571d838be31d67f53f6e4ee15147d5dc2934fee1b1e" + url: "https://pub.dev" + source: hosted + version: "3.0.2" + shelf_static: + dependency: transitive + description: + name: shelf_static + sha256: c87c3875f91262785dade62d135760c2c69cb217ac759485334c5857ad89f6e3 + url: "https://pub.dev" + source: hosted + version: "1.1.3" + shelf_web_socket: + dependency: transitive + description: + name: shelf_web_socket + sha256: "073c147238594ecd0d193f3456a5fe91c4b0abbcc68bf5cd95b36c4e194ac611" + url: "https://pub.dev" + source: hosted + version: "2.0.0" + source_map_stack_trace: + dependency: transitive + description: + name: source_map_stack_trace + sha256: c0713a43e323c3302c2abe2a1cc89aa057a387101ebd280371d6a6c9fa68516b + url: "https://pub.dev" + source: hosted + version: "2.1.2" + source_maps: + dependency: transitive + description: + name: source_maps + sha256: "708b3f6b97248e5781f493b765c3337db11c5d2c81c3094f10904bfa8004c703" + url: "https://pub.dev" + source: hosted + version: "0.10.12" + source_span: + dependency: transitive + description: + name: source_span + sha256: "53e943d4206a5e30df338fd4c6e7a077e02254531b138a15aec3bd143c1a8b3c" + url: "https://pub.dev" + source: hosted + version: "1.10.0" + stack_trace: + dependency: transitive + description: + name: stack_trace + sha256: "9f47fd3630d76be3ab26f0ee06d213679aa425996925ff3feffdec504931c377" + url: "https://pub.dev" + source: hosted + version: "1.12.0" + stream_channel: + dependency: transitive + description: + name: stream_channel + sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7 + url: "https://pub.dev" + source: hosted + version: "2.1.2" + string_scanner: + dependency: transitive + description: + name: string_scanner + sha256: "0bd04f5bb74fcd6ff0606a888a30e917af9bd52820b178eaa464beb11dca84b6" + url: "https://pub.dev" + source: hosted + version: "1.4.0" + term_glyph: + dependency: transitive + description: + name: term_glyph + sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84 + url: "https://pub.dev" + source: hosted + version: "1.2.1" + test: + dependency: "direct dev" + description: + name: test + sha256: "713a8789d62f3233c46b4a90b174737b2c04cb6ae4500f2aa8b1be8f03f5e67f" + url: "https://pub.dev" + source: hosted + 
version: "1.25.8" + test_api: + dependency: transitive + description: + name: test_api + sha256: "664d3a9a64782fcdeb83ce9c6b39e78fd2971d4e37827b9b06c3aa1edc5e760c" + url: "https://pub.dev" + source: hosted + version: "0.7.3" + test_core: + dependency: transitive + description: + name: test_core + sha256: "12391302411737c176b0b5d6491f466b0dd56d4763e347b6714efbaa74d7953d" + url: "https://pub.dev" + source: hosted + version: "0.6.5" + typed_data: + dependency: transitive + description: + name: typed_data + sha256: f9049c039ebfeb4cf7a7104a675823cd72dba8297f264b6637062516699fa006 + url: "https://pub.dev" + source: hosted + version: "1.4.0" + vm_service: + dependency: transitive + description: + name: vm_service + sha256: "0968250880a6c5fe7edc067ed0a13d4bae1577fe2771dcf3010d52c4a9d3ca14" + url: "https://pub.dev" + source: hosted + version: "14.3.1" + watcher: + dependency: transitive + description: + name: watcher + sha256: "3d2ad6751b3c16cf07c7fca317a1413b3f26530319181b37e3b9039b84fc01d8" + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web: + dependency: transitive + description: + name: web + sha256: cd3543bd5798f6ad290ea73d210f423502e71900302dde696f8bff84bf89a1cb + url: "https://pub.dev" + source: hosted + version: "1.1.0" + web_socket: + dependency: transitive + description: + name: web_socket + sha256: "3c12d96c0c9a4eec095246debcea7b86c0324f22df69893d538fcc6f1b8cce83" + url: "https://pub.dev" + source: hosted + version: "0.1.6" + web_socket_channel: + dependency: transitive + description: + name: web_socket_channel + sha256: "9f187088ed104edd8662ca07af4b124465893caf063ba29758f97af57e61da8f" + url: "https://pub.dev" + source: hosted + version: "3.0.1" + webkit_inspection_protocol: + dependency: transitive + description: + name: webkit_inspection_protocol + sha256: "87d3f2333bb240704cd3f1c6b5b7acd8a10e7f0bc28c28dcf14e782014f4a572" + url: "https://pub.dev" + source: hosted + version: "1.2.1" + yaml: + dependency: transitive + description: + name: yaml + sha256: "75769501ea3489fca56601ff33454fe45507ea3bfb014161abc3b43ae25989d5" + url: "https://pub.dev" + source: hosted + version: "3.1.2" +sdks: + dart: ">=3.5.0 <4.0.0" diff --git a/drivers/rethinkdb/pubspec.yaml b/drivers/rethinkdb/pubspec.yaml new file mode 100644 index 0000000..5106f31 --- /dev/null +++ b/drivers/rethinkdb/pubspec.yaml @@ -0,0 +1,19 @@ +name: platform_driver_rethinkdb +description: A dart driver for connecting to RethinkDB, the open-source database for the realtime web +version: 1.1.0 +homepage: https://github.com/dart-backend/belatuk-rethinkdb +repository: https://github.com/dart-backend/belatuk-rethinkdb + +environment: + sdk: ">=3.5.0 <4.0.0" + +dependencies: + protobuf: ^3.1.0 + fixnum: ^1.0.0 + convert: ^3.0.1 + hashlib: ^1.17.0 + crypto: ^3.0.1 + +dev_dependencies: + test: ^1.25.0 + lints: ^5.0.0 \ No newline at end of file diff --git a/drivers/rethinkdb/test/aggregation_test.dart b/drivers/rethinkdb/test/aggregation_test.dart new file mode 100644 index 0000000..1cdc23a --- /dev/null +++ b/drivers/rethinkdb/test/aggregation_test.dart @@ -0,0 +1,95 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? databaseName; + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUp(() async { + connection = await r.connect(); + + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + + if (databaseName == null) { + String dbName = await r.uuid().run(connection!); + databaseName = "test_database_${dbName.replaceAll("-", "")}"; + } + + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + } + connection!.use(testDbName!); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("count command -> ", () { + test("should count an array", () async { + int count = await r.expr([1, 2, 3]).count().run(connection); + expect(count, equals(3)); + }); + + test("should count an array with a filter", () async { + int count = await r.expr([2, 1, 2, 3, 2, 2]).count(2).run(connection); + expect(count, equals(4)); + }); + + test("should count items in a table", () async { + await r.tableCreate(tableName!).run(connection!); + List testData = [ + {"id": 1}, + {"id": 2}, + {"id": 3}, + {"id": 4}, + {"id": 5} + ]; + await r.table(tableName!).insert(testData).run(connection!); + int count = await r.table(tableName!).count().run(connection!); + expect(count, equals(5)); + }); + + test("should count items in a table with a filter", () async { + List testData = [ + {"id": 6, "age": 21}, + {"id": 7, "age": 22}, + {"id": 8, "age": 21}, + {"id": 9, "age": 33}, + {"id": 10, "age": 34} + ]; + await r.table(tableName!).insert(testData).run(connection!); + int count = await r.table(tableName!)('age').count(21).run(connection); + expect(count, equals(2)); + + count = await r.table(tableName!).count((user) { + return user('id').lt(8); + }).run(connection!); + + expect(count, equals(7)); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName!).run(connection!); + + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + expect(response['tables_dropped'], equals(1)); + }); + }); +} diff --git a/drivers/rethinkdb/test/comparison_operators_test.dart b/drivers/rethinkdb/test/comparison_operators_test.dart new file mode 100644 index 0000000..bf5d8d6 --- /dev/null +++ b/drivers/rethinkdb/test/comparison_operators_test.dart @@ -0,0 +1,615 @@ +import 'package:test/test.dart'; +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("eq command -> ", () { + test("should check two equal values with eq", () async { + var result = await r.expr(0).eq(0).run(connection); + expect(result, equals(true)); + }); + test("should check two different values with eq", () async { + var result = await r.expr(0).eq(1).run(connection); + expect(result, equals(false)); + }); + test("should check three equal values with eq", () async { + var result = await r.expr(0).eq(0).eq(true).run(connection); + expect(result, equals(true)); + }); + test("should check three different values with eq", () async { + var result = await r.expr(0).eq(1).eq(true).run(connection); + expect(result, equals(false)); + }); + test("should check two equal values together with eq", () async { + var result = await r.eq(0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check two different values together with eq", () async { + var result = await r.eq(0, 1).run(connection); + expect(result, equals(false)); + }); + test("should check three equal values together with eq", () async { + var result = await r.eq(0, 0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check one different and two equal values together with eq", + () async { + var result = await r.eq(0, 1, 1).run(connection); + expect(result, equals(false)); + }); + test("should check three different values together with eq", () async { + var result = await r.eq(0, 1, 2).run(connection); + expect(result, equals(false)); + }); + test("should use args with eq to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.eq(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with eq to compare multiple different values (one different from the others)", + () async { + var vals = [10, 20, 20]; + var result = await r.eq(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with eq to compare multiple different values", + () async { + var vals = [10, 20, 30]; + var result = await r.eq(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use two args with eq to compare multiple equal values", + () async { + var vals1 = [10, 10, 10]; + var vals2 = [20, 20, 20]; + var result = + await r.eq(r.args(vals1)).eq(r.eq(r.args(vals2))).run(connection); + expect(result, equals(true)); + }); + test("should use two args with eq to compare multiple different values", + () async { + var vals1 = [10, 10, 10]; + var vals2 = [10, 20, 20]; + var result = + await r.eq(r.args(vals1)).eq(r.eq(r.args(vals2))).run(connection); + expect(result, equals(false)); + }); + }); + + group("ne command -> ", () { + test("should check two equal values with ne", () async { + var result = await r.expr(0).ne(0).run(connection); + expect(result, equals(false)); + }); + test("should 
check two different values with ne", () async { + var result = await r.expr(0).ne(1).run(connection); + expect(result, equals(true)); + }); + test("should check three equal values with ne", () async { + var result = await r.expr(0).ne(0).ne(false).run(connection); + expect(result, equals(false)); + }); + test("should check three different values with ne", () async { + var result = await r.expr(0).ne(1).ne(false).run(connection); + expect(result, equals(true)); + }); + test("should check two equal values together with ne", () async { + var result = await r.ne(0, 0).run(connection); + expect(result, equals(false)); + }); + test("should check two different values together with ne", () async { + var result = await r.ne(0, 1).run(connection); + expect(result, equals(true)); + }); + test("should check three equal values together with ne", () async { + var result = await r.ne(0, 0, 0).run(connection); + expect(result, equals(false)); + }); + test("should check one different and two equal values together with ne", + () async { + var result = await r.ne(0, 1, 1).run(connection); + expect(result, equals(true)); + }); + test("should check three different values together with ne", () async { + var result = await r.ne(0, 1, 2).run(connection); + expect(result, equals(true)); + }); + test("should use args with ne to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.ne(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with ne to compare multiple different values (one different from the others)", + () async { + var vals = [10, 20, 20]; + var result = await r.ne(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test("should use args with ne to compare multiple different values", + () async { + var vals = [10, 20, 30]; + var result = await r.ne(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test("should use two args with ne to compare multiple equal values", + () async { + var vals1 = [10, 10, 10]; + var vals2 = [20, 20, 20]; + var result = + await r.ne(r.args(vals1)).ne(r.ne(r.args(vals2))).run(connection); + expect(result, equals(false)); + }); + test("should use two args with ne to compare multiple different values", + () async { + var vals1 = [10, 10, 10]; + var vals2 = [10, 20, 20]; + var result = + await r.ne(r.args(vals1)).ne(r.ne(r.args(vals2))).run(connection); + expect(result, equals(true)); + }); + }); + + group("lt command -> ", () { + test("should check two equal values with lt", () async { + var result = await r.expr(0).lt(0).run(connection); + expect(result, equals(false)); + }); + test("should check two increasing values with lt", () async { + var result = await r.expr(0).lt(1).run(connection); + expect(result, equals(true)); + }); + test("should check two decreasing values with lt", () async { + var result = await r.expr(1).lt(0).run(connection); + expect(result, equals(false)); + }); + test("should check two equal values together with lt", () async { + var result = await r.lt(0, 0).run(connection); + expect(result, equals(false)); + }); + test("should check two increasing values together with lt", () async { + var result = await r.lt(0, 1).run(connection); + expect(result, equals(true)); + }); + test("should check two decreasing values together with lt", () async { + var result = await r.lt(1, 0).run(connection); + expect(result, equals(false)); + }); + test("should check three equal values together with lt", () async { + var result = await r.lt(0, 0, 
0).run(connection); + expect(result, equals(false)); + }); + test("should check three increasing values together with lt", () async { + var result = await r.lt(0, 1, 2).run(connection); + expect(result, equals(true)); + }); + test("should check three decreasing values together with lt", () async { + var result = await r.lt(2, 1, 0).run(connection); + expect(result, equals(false)); + }); + test( + "should check one lower and two higher and equal values together with lt", + () async { + var result = await r.lt(0, 1, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two lower and equal and one higher value together with lt", + () async { + var result = await r.lt(0, 0, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two higher and equal and one lower values together with lt", + () async { + var result = await r.lt(1, 1, 0).run(connection); + expect(result, equals(false)); + }); + test( + "should check one higher and two lower and equal value together with lt", + () async { + var result = await r.lt(1, 0, 0).run(connection); + expect(result, equals(false)); + }); + test("should use args with lt to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with lt to compare multiple values (one lower and two higher and equal)", + () async { + var vals = [10, 20, 20]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with lt to compare multiple values (two lower and equal and one higher)", + () async { + var vals = [10, 10, 20]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with lt to compare multiple values (one higher and two lower and equal)", + () async { + var vals = [20, 10, 10]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with lt to compare multiple values (two higher and equal and one lower)", + () async { + var vals = [20, 20, 10]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with lt to compare multiple increasing values", + () async { + var vals = [10, 20, 30]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test("should use args with lt to compare multiple decreasing values", + () async { + var vals = [30, 20, 10]; + var result = await r.lt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + }); + + group("le command -> ", () { + test("should check two equal values with le", () async { + var result = await r.expr(0).le(0).run(connection); + expect(result, equals(true)); + }); + test("should check two increasing values with le", () async { + var result = await r.expr(0).le(1).run(connection); + expect(result, equals(true)); + }); + test("should check two decreasing values with le", () async { + var result = await r.expr(1).le(0).run(connection); + expect(result, equals(false)); + }); + test("should check two equal values together with le", () async { + var result = await r.le(0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check two increasing values together with le", () async { + var result = await r.le(0, 1).run(connection); + expect(result, equals(true)); + }); + test("should check two 
decreasing values together with le", () async { + var result = await r.le(1, 0).run(connection); + expect(result, equals(false)); + }); + test("should check three equal values together with le", () async { + var result = await r.le(0, 0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check three increasing values together with le", () async { + var result = await r.le(0, 1, 2).run(connection); + expect(result, equals(true)); + }); + test("should check three decreasing values together with le", () async { + var result = await r.le(2, 1, 0).run(connection); + expect(result, equals(false)); + }); + test( + "should check one lower and two higher and equal values together with le", + () async { + var result = await r.le(0, 1, 1).run(connection); + expect(result, equals(true)); + }); + test( + "should check two lower and equal and one higher value together with le", + () async { + var result = await r.le(0, 0, 1).run(connection); + expect(result, equals(true)); + }); + test( + "should check two higher and equal and one lower values together with le", + () async { + var result = await r.le(1, 1, 0).run(connection); + expect(result, equals(false)); + }); + test( + "should check one higher and two lower and equal value together with le", + () async { + var result = await r.le(1, 0, 0).run(connection); + expect(result, equals(false)); + }); + test("should use args with le to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with le to compare multiple values (one lower and two higher and equal)", + () async { + var vals = [10, 20, 20]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with le to compare multiple values (two lower and equal and one higher)", + () async { + var vals = [10, 10, 20]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with le to compare multiple values (one higher and two lower and equal)", + () async { + var vals = [20, 10, 10]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with le to compare multiple values (two higher and equal and one lower)", + () async { + var vals = [20, 20, 10]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with le to compare multiple increasing values", + () async { + var vals = [10, 20, 30]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test("should use args with le to compare multiple decreasing values", + () async { + var vals = [30, 20, 10]; + var result = await r.le(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + }); + + group("gt command -> ", () { + test("should check two equal values with gt", () async { + var result = await r.expr(0).gt(0).run(connection); + expect(result, equals(false)); + }); + test("should check two increasing values with gt", () async { + var result = await r.expr(0).gt(1).run(connection); + expect(result, equals(false)); + }); + test("should check two decreasing values with gt", () async { + var result = await r.expr(1).gt(0).run(connection); + expect(result, equals(true)); + }); + test("should check two equal values together with gt", () async { + var result = await r.gt(0, 
0).run(connection); + expect(result, equals(false)); + }); + test("should check two increasing values together with gt", () async { + var result = await r.gt(0, 1).run(connection); + expect(result, equals(false)); + }); + test("should check two decreasing values together with gt", () async { + var result = await r.gt(1, 0).run(connection); + expect(result, equals(true)); + }); + test("should check three equal values together with gt", () async { + var result = await r.gt(0, 0, 0).run(connection); + expect(result, equals(false)); + }); + test("should check three increasing values together with gt", () async { + var result = await r.gt(0, 1, 2).run(connection); + expect(result, equals(false)); + }); + test("should check three decreasing values together with gt", () async { + var result = await r.gt(2, 1, 0).run(connection); + expect(result, equals(true)); + }); + test( + "should check one lower and two higher and equal values together with gt", + () async { + var result = await r.gt(0, 1, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two lower and equal and one higher value together with gt", + () async { + var result = await r.gt(0, 0, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two higher and equal and one lower values together with gt", + () async { + var result = await r.gt(1, 1, 0).run(connection); + expect(result, equals(false)); + }); + test( + "should check one higher and two lower and equal value together with gt", + () async { + var result = await r.gt(1, 0, 0).run(connection); + expect(result, equals(false)); + }); + test("should use args with gt to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with gt to compare multiple values (one lower and two higher and equal)", + () async { + var vals = [10, 20, 20]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with gt to compare multiple values (two lower and equal and one higher)", + () async { + var vals = [10, 10, 20]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with gt to compare multiple values (one higher and two lower and equal)", + () async { + var vals = [20, 10, 10]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with gt to compare multiple values (two higher and equal and one lower)", + () async { + var vals = [20, 20, 10]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with gt to compare multiple increasing values", + () async { + var vals = [10, 20, 30]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with gt to compare multiple decreasing values", + () async { + var vals = [30, 20, 10]; + var result = await r.gt(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + }); + + group("ge command -> ", () { + test("should check two equal values with ge", () async { + var result = await r.expr(0).ge(0).run(connection); + expect(result, equals(true)); + }); + test("should check two increasing values with ge", () async { + var result = await r.expr(0).ge(1).run(connection); + expect(result, equals(false)); + }); + test("should 
check two decreasing values with ge", () async { + var result = await r.expr(1).ge(0).run(connection); + expect(result, equals(true)); + }); + test("should check two equal values together with ge", () async { + var result = await r.ge(0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check two increasing values together with ge", () async { + var result = await r.ge(0, 1).run(connection); + expect(result, equals(false)); + }); + test("should check two decreasing values together with ge", () async { + var result = await r.ge(1, 0).run(connection); + expect(result, equals(true)); + }); + test("should check three equal values together with ge", () async { + var result = await r.ge(0, 0, 0).run(connection); + expect(result, equals(true)); + }); + test("should check three increasing values together with ge", () async { + var result = await r.ge(0, 1, 2).run(connection); + expect(result, equals(false)); + }); + test("should check three decreasing values together with ge", () async { + var result = await r.ge(2, 1, 0).run(connection); + expect(result, equals(true)); + }); + test( + "should check one lower and two higher and equal values together with ge", + () async { + var result = await r.ge(0, 1, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two lower and equal and one higher value together with ge", + () async { + var result = await r.ge(0, 0, 1).run(connection); + expect(result, equals(false)); + }); + test( + "should check two higher and equal and one lower values together with ge", + () async { + var result = await r.ge(1, 1, 0).run(connection); + expect(result, equals(true)); + }); + test( + "should check one higher and two lower and equal value together with ge", + () async { + var result = await r.ge(1, 0, 0).run(connection); + expect(result, equals(true)); + }); + test("should use args with ge to compare multiple equal values", () async { + var vals = [10, 10, 10]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with ge to compare multiple values (one lower and two higher and equal)", + () async { + var vals = [10, 20, 20]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with ge to compare multiple values (two lower and equal and one higher)", + () async { + var vals = [10, 10, 20]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test( + "should use args with ge to compare multiple values (one higher and two lower and equal)", + () async { + var vals = [20, 10, 10]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test( + "should use args with ge to compare multiple values (two higher and equal and one lower)", + () async { + var vals = [20, 20, 10]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + test("should use args with ge to compare multiple increasing values", + () async { + var vals = [10, 20, 30]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(false)); + }); + test("should use args with ge to compare multiple decreasing values", + () async { + var vals = [30, 20, 10]; + var result = await r.ge(r.args(vals)).run(connection); + expect(result, equals(true)); + }); + }); +} diff --git a/drivers/rethinkdb/test/connection_test.dart b/drivers/rethinkdb/test/connection_test.dart new file mode 100644 
index 0000000..e809900 --- /dev/null +++ b/drivers/rethinkdb/test/connection_test.dart @@ -0,0 +1,64 @@ +import 'dart:async'; + +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + + test("connect() connects with defaults if no params are passed", () async { + Connection c = await r.connect(); + expect(c, isNot(null)); + c.close(); + }); + + test("connect() connects with non-default if params are passed", () async { + Connection conn = await r.connect( + db: 'testDB', + host: "localhost", + port: 28015, + user: "admin", + password: ""); + + expect(conn, isNot(null)); + conn.close(); + }); + + test("connection should run onconnect and onclose listeners", () async { + int connectCounter = 0; + int closeCounter = 0; + f() => connectCounter++; + fClose() => closeCounter++; + Connection conn = await r.connect(); + + expect(connectCounter, equals(0)); + conn.on('connect', f); + conn.on('close', fClose); + expect(closeCounter, equals(0)); + conn.close(); + conn.close(); + expect(closeCounter, equals(1)); + Connection c = await conn.connect(); + + expect(connectCounter, equals(1)); + c.close(); + }); + + test("connections with noreplywait should return a Future", () async { + Connection conn = await r.connect(); + var fut = conn.noreplyWait(); + expect(fut is Future, equals(true)); + conn.close(); + }); + + test("connections should return server info", () async { + Connection conn = await r.connect(); + Map m = await conn.server(); + + expect(m.keys.length, equals(3)); + expect(m.containsKey('id'), equals(true)); + expect(m.containsKey('name'), equals(true)); + expect(m.containsKey('proxy'), equals(true)); + conn.close(); + }); +} diff --git a/drivers/rethinkdb/test/data_test.dart b/drivers/rethinkdb/test/data_test.dart new file mode 100644 index 0000000..def1561 --- /dev/null +++ b/drivers/rethinkdb/test/data_test.dart @@ -0,0 +1,226 @@ +import 'dart:async'; + +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? connection; + + setUpTable() async { + return await r.table(tableName!).insert([ + { + 'id': 1, + 'name': 'Jane Doe', + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ] + }, + { + 'id': 2, + 'name': 'Jon Doe', + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'nickname': 'Jo' + }, + {'id': 3, 'name': 'Firstname Last'} + ]).run(connection!); + } + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + await setUpTable(); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("contains command -> ", () { + test("should check if the table contains an existing child", () async { + var result = await r + .table(tableName!) 
+ .get(1)('children') + .contains({'id': 1, 'name': 'Robert'}).run(connection!); + expect(result, equals(true)); + }); + test("should check if the table contains a non-existing child", () async { + var result = await r + .table(tableName!) + .get(1)('children') + .contains({'id': 1, 'name': 'Robert 1'}).run(connection!); + expect(result, equals(false)); + }); + // TODO: add more tests. + }); + + group("hasFields command -> ", () { + test("should use hasFields and return the people who have children", + () async { + Cursor parents = + await r.table(tableName!).hasFields('children').run(connection!); + //expect(parents is Cursor, equals(true)); + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(2)); + + expect(parentsList[1]['id'], equals(1)); + expect(parentsList[0]['id'], equals(2)); + + expect(parentsList[1]['name'], equals('Jane Doe')); + expect(parentsList[0]['name'], equals('Jon Doe')); + }); + test( + "should use hasFields and return the people who have children and a nickname", + () async { + Cursor parentsWithNickname = await r + .table(tableName!) + .hasFields('children', 'nickname') + .run(connection!); + //expect(parentsWithNickname is Cursor, equals(true)); + List parentsWithNicknameList = await parentsWithNickname.toList(); + + expect(parentsWithNicknameList.length, equals(1)); + + expect(parentsWithNicknameList[0]['id'], equals(2)); + + expect(parentsWithNicknameList[0]['name'], equals('Jon Doe')); + }); + // TODO: add more tests. + }); + + group("withFields command -> ", () { + test( + "should use withFields and return the children of the people who have them", + () async { + Cursor parents = + await r.table(tableName!).withFields('children').run(connection!); + //expect(parents is Cursor, equals(true)); + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(2)); + + expect( + parentsList[1]['children'], + equals([ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ])); + expect( + parentsList[0]['children'], + equals([ + {'id': 1, 'name': 'Louis'} + ])); + }); + test( + "should use withFields and return the children and the nickname of the people who have them", + () async { + Cursor parentsWithNickname = await r + .table(tableName!) 
+ .withFields('children', 'nickname') + .run(connection!); + //expect(parentsWithNickname is Cursor, equals(true)); + List parentsWithNicknameList = await parentsWithNickname.toList(); + + expect(parentsWithNicknameList.length, equals(1)); + + expect( + parentsWithNicknameList[0]['children'], + equals([ + {'id': 1, 'name': 'Louis'} + ])); + + expect(parentsWithNicknameList[0]['nickname'], equals('Jo')); + }); + }); + + group("keys command -> ", () { + test("should return the keys from the person with id equal to 1", () async { + var personKeys = await r.table(tableName!).get(1).keys().run(connection!); + expect(personKeys is List, equals(true)); + + expect(personKeys.length, equals(3)); + + expect(personKeys[0], equals("children")); + expect(personKeys[1], equals("id")); + expect(personKeys[2], equals("name")); + }); + }); + + group("values command -> ", () { + test("should return the values from the person with id equal to 1", + () async { + var personValues = + await r.table(tableName!).get(1).values().run(connection!); + expect(personValues is List, equals(true)); + + expect(personValues.length, equals(3)); + + expect( + personValues[0], + equals([ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ])); + expect(personValues[1], equals(1)); + expect(personValues[2], equals('Jane Doe')); + }); + }); + + group("changes command -> ", () { + test("should return the changes from the person that is updated", () async { + Feed feed = + await r.table(tableName!).changes().run(connection!).asStream().first; + Timer(Duration(seconds: 1), () async { + var result = await r + .table(tableName!) + .get(1) + .update({'name': "Marcelo Neppel"}).run(connection!); + print("result: $result"); + }); + dynamic feedData = await feed.first; + expect( + feedData, + equals({ + 'new_val': { + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 1, + 'name': 'Marcelo Neppel' + }, + 'old_val': { + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 1, + 'name': 'Jane Doe' + } + })); + }); + }); +} diff --git a/drivers/rethinkdb/test/geojson_test.dart b/drivers/rethinkdb/test/geojson_test.dart new file mode 100644 index 0000000..688dc11 --- /dev/null +++ b/drivers/rethinkdb/test/geojson_test.dart @@ -0,0 +1,451 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + var r = RethinkDb() as dynamic; + String? databaseName; + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName).run(connection); + } + if (databaseName == null) { + String dbName = await r.uuid().run(connection); + databaseName = "test_database_${dbName.replaceAll("-", "")}"; + } + if (tableName == null) { + String tblName = await r.uuid().run(connection); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + } + connection!.use(testDbName!); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName).run(connection); + } + connection!.close(); + }); + + group("circle command -> ", () { + int long = -90; + int lat = 0; + int rad = 5; + test( + "should create a polygon given an array containing longitude and latitude and also a radius", + () async { + Map response = await r.circle([long, lat], rad).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('Polygon')); + }); + + test("should create a polygon given a point and also a radius", () async { + Point p = r.point(long, lat); + Map response = await r.circle(p, rad).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('Polygon')); + }); + + test("should create a polygon with a specified number of vertices", + () async { + Point p = r.point(long, lat); + + Map response = + await r.circle(p, rad, {'num_vertices': 4}).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response['coordinates'][0].length, equals(5)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('Polygon')); + }); + + test("should create a polygon with a specified geo_system", () async { + rad = 1; + Point p = r.point(long, lat); + + Map response = + await r.circle(p, rad, {'geo_system': 'unit_sphere'}).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('Polygon')); + }); + + test("should create a polygon with a specified unit", () async { + rad = 1; + Point p = r.point(long, lat); + + Map response = await r + .circle(p, rad, {'num_vertices': 3, 'unit': 'nm'}).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('Polygon')); + }); + + test("should create an unfilled line", () async { + Point p = r.point(long, lat); + + Map response = await r + .circle(p, rad, {'num_vertices': 4, 'fill': false}).run(connection); + + expect(response.containsKey('coordinates'), equals(true)); + expect(response.containsKey('type'), equals(true)); + expect(response['type'], equals('LineString')); + }); + }); + + group("distance command -> ", () { + Circle c = r.circle([-90, 0], 1); + Point p = r.point(0, -90); + + test("should compute the distance between a point and a polygon", () async { + num distance = await r.distance(p, c).run(connection); + + expect(distance, equals(10001964.729312724)); + }); + + test("should compute the distance for a given geo_system", () async { + num distance = + await r.distance(p, c, {'geo_system': 'unit_sphere'}).run(connection); + + expect(distance, 
equals(1.5707961689526464)); + }); + + test("should compute the distance for a given unit", () async { + num distance = await r.distance( + p, c, {'geo_system': 'unit_sphere', 'unit': 'ft'}).run(connection); + expect(distance, equals(5.153530738033616)); + }); + }); + + test("geojson command -> ", () async { + var rqlGeo = await r.geojson({ + 'type': 'Point', + 'coordinates': [-122.423246, 37.779388] + }).run(connection); + expect(rqlGeo.containsKey('coordinates'), equals(true)); + expect(rqlGeo['type'], equals('Point')); + }); + + group("toGeojson command -> ", () { + test("should convert a reql geometry to a GeoJSON object", () async { + Map geo = await r.point(0, 0).toGeojson().run(connection); + + expect(geo.containsKey('coordinates'), equals(true)); + expect(geo['coordinates'], equals([0, 0])); + expect(geo.containsKey('type'), equals(true)); + expect(geo['type'], equals('Point')); + }); + }); + + group("includes command -> ", () { + test("should return true if a geometry includes some other geometry", + () async { + Point point1 = r.point(-117.220406, 32.719464); + Point point2 = r.point(-117.206201, 32.725186); + bool doesInclude = + await r.circle(point1, 2000).includes(point2).run(connection); + + expect(doesInclude, equals(true)); + }); + + test( + "should return false if a geometry does not include some other geometry", + () async { + Point point1 = r.point(-0, 0); + Point point2 = r.point(-100, 90); + bool doesInclude = + await r.circle(point1, 1).includes(point2).run(connection); + + expect(doesInclude, equals(false)); + }); + + test( + "should filter a sequence to only contain items that include some other geometry", + () async { + Point point1 = r.point(-0, 0); + Point point2 = r.point(-1, 1); + Point point3 = r.point(-99, 90); + Point point4 = r.point(101, 90); + Point point5 = r.point(-100, 90); + List included = await r + .expr([ + r.circle(point1, 2), + r.circle(point2, 2), + r.circle(point3, 2), + r.circle(point4, 2) + ]) + .includes(point5) + .run(connection); + + expect(included.length == 2, equals(true)); + }); + }); + + group("intersects command -> ", () { + test("should return true if a geometry intersects some other geometry", + () async { + Point point1 = r.point(-117.220406, 32.719464); + Line line = r.line(r.point(-117.206201, 32.725186), r.point(0, 1)); + bool doesIntersect = + await r.circle(point1, 2000).intersects(line).run(connection); + + expect(doesIntersect, equals(true)); + }); + + test( + "should return false if a geometry does not intersect some other geometry", + () async { + Point point1 = r.point(-117.220406, 32.719464); + Line line = r.line(r.point(20, 20), r.point(0, 1)); + bool doesIntersect = + await r.circle(point1, 1).intersects(line).run(connection); + + expect(doesIntersect, equals(false)); + }); + + test( + "should filter a sequence to only contain items that intersect some other geometry", + () async { + var point1 = r.point(0, 0); + var point2 = r.point(33, 30); + var point3 = r.point(-17, 3); + var point4 = r.point(20, 20); + var point5 = r.point(-100, 90); + Line line = r.line(point1, point2); + List intersecting = await r + .expr([point1, point2, point3, point4, point5]) + .intersects(line) + .run(connection); + expect(intersecting.length == 2, equals(true)); + }); + }); + + group("line command -> ", () { + test("should create a line from two long/lat arrays", () async { + Map line = await r.line([0, 0], [-20, -90]).run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ 
+ [0, 0], + [-20, -90] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('LineString')); + }); + test("should create a line from many long/lat arrays", () async { + Map line = await r.line([0, 0], [-20, -90], [3, 3]).run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ + [0, 0], + [-20, -90], + [3, 3] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('LineString')); + }); + test("should create a line from two points", () async { + Map line = await r.line(r.point(0, 0), r.point(-20, -90)).run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ + [0, 0], + [-20, -90] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('LineString')); + }); + test("should create a line from many points", () async { + Map line = await r + .line(r.point(0, 0), r.point(-20, -90), r.point(3, 3)) + .run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ + [0, 0], + [-20, -90], + [3, 3] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('LineString')); + }); + test("should create a line from a combination of arrays and points", + () async { + Map line = await r + .line(r.point(0, 0), [-20, -90], r.point(3, 3)) + .run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ + [0, 0], + [-20, -90], + [3, 3] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('LineString')); + }); + test("should be able to fill() to create a polygon from a line", () async { + Map line = await r + .line(r.point(0, 0), [-20, -90], r.point(3, 3)) + .fill() + .run(connection); + + expect(line.containsKey('coordinates'), equals(true)); + expect( + line['coordinates'], + equals([ + [ + [0, 0], + [-20, -90], + [3, 3], + [0, 0] + ] + ])); + expect(line.containsKey('type'), equals(true)); + expect(line['type'], equals('Polygon')); + }); + }); + test("point command -> should create a point given a longitude and latitude", + () async { + Map point = await r.point(90, 90).run(connection); + + expect(point.containsKey('coordinates'), equals(true)); + expect(point['coordinates'], equals([90, 90])); + expect(point.containsKey('type'), equals(true)); + expect(point['type'], equals('Point')); + }); + + group("polygon command -> ", () { + test("should create a polygon given three long/lat arrays", () async { + Map poly = await r.polygon([0, 0], [40, 40], [20, 0]).run(connection); + expect(poly.containsKey('coordinates'), equals(true)); + expect(poly['coordinates'][0].length, equals(4)); + expect(poly.containsKey('type'), equals(true)); + expect(poly['type'], equals('Polygon')); + }); + + test("should create a polygon given many long/lat arrays", () async { + Map poly = await r.polygon( + [0, 0], [40, 40], [20, 0], [50, -10], [-90, 80]).run(connection); + + expect(poly.containsKey('coordinates'), equals(true)); + expect(poly['coordinates'][0].length, equals(6)); + expect(poly.containsKey('type'), equals(true)); + expect(poly['type'], equals('Polygon')); + }); + + Point point1 = r.point(0, 0); + Point point2 = r.point(40, 0); + Point point3 = r.point(40, 40); + Point point4 = r.point(20, 50); + Point point5 = r.point(0, 40); + test("should create a polygon given three points", () async { + Map poly = await r.polygon(point1, 
point2, point3).run(connection); + + expect(poly.containsKey('coordinates'), equals(true)); + expect(poly['coordinates'][0].length, equals(4)); + expect(poly.containsKey('type'), equals(true)); + expect(poly['type'], equals('Polygon')); + }); + + test("should create a polygon given many points", () async { + Map poly = await r + .polygon(point1, point2, point3, point4, point5) + .run(connection); + + expect(poly.containsKey('coordinates'), equals(true)); + expect(poly['coordinates'][0].length, equals(6)); + expect(poly.containsKey('type'), equals(true)); + expect(poly['type'], equals('Polygon')); + }); + + test("should allow a subPolygon to be removed from a polygon", () async { + Map poly = await r + .polygon(point1, point2, point3, point4, point5) + .polygonSub(r.circle(r.point(20, 20), 1)) + .run(connection); + + expect(poly.containsKey('coordinates'), equals(true)); + expect(poly['coordinates'][0].length, equals(6)); + expect(poly.containsKey('type'), equals(true)); + expect(poly['type'], equals('Polygon')); + }); + }); + + test( + "getIntersecting command -> should return a cursor containing all intersecting records of a table", + () async { + List insertedData = [ + { + 'location': r.polygon( + r.point(0, 0), r.point(40, 0), r.point(40, 40), r.point(0, 40)), + 'name': 'a' + }, + { + 'location': r.polygon( + r.point(40, 0), r.point(80, 0), r.point(80, 40), r.point(40, 40)), + 'name': 'a' + }, + { + 'location': r.polygon( + r.point(40, 40), r.point(80, 40), r.point(80, 80), r.point(40, 80)), + 'name': 'a' + } + ]; + + await r.tableCreate(tableName).run(connection); + await r + .table(tableName) + .indexCreate('location', {'geo': true}).run(connection); + await r.table(tableName).indexWait('location').run(connection); + await r.table(tableName).insert(insertedData).run(connection); + Cursor intersecting = await r.table(tableName).getIntersecting( + r.circle(r.point(40, 20), 1), {'index': 'location'}).run(connection); + List v = await intersecting.toList(); + + expect(v.length, equals(2)); + }); + + group("getNearest command -> ", () { + test("should get a list of documents nearest a point", () async { + List l = await r + .table(tableName) + .getNearest(r.point(80.5, 20), {'index': 'location'}).run(connection); + + //expect(l is List, equals(true)); + expect(l.length, equals(1)); + }); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName).run(connection); + + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + expect(response['tables_dropped'], equals(1)); + }); +} diff --git a/drivers/rethinkdb/test/numeric_operators_test.dart b/drivers/rethinkdb/test/numeric_operators_test.dart new file mode 100644 index 0000000..acafb14 --- /dev/null +++ b/drivers/rethinkdb/test/numeric_operators_test.dart @@ -0,0 +1,342 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("add command -> ", () { + test("should add two numbers", () async { + var result = await r.expr(2).add(2).run(connection); + expect(result, equals(4)); + }); + test("should add three numbers", () async { + var result = await r.expr(2).add(2).add(2).run(connection); + expect(result, equals(6)); + }); + test("should add two numbers together", () async { + var result = await r.add(2, 2).run(connection); + expect(result, equals(4)); + }); + test("should add three numbers together", () async { + var result = await r.add(2, 2, 2).run(connection); + expect(result, equals(6)); + }); + test("should concatenate strings", () async { + var result = await r.expr("foo").add("bar", "baz").run(connection); + expect(result, equals("foobarbaz")); + }); + test("should concatenate arrays", () async { + var result = await r.expr(["foo", "bar"]).add(["buzz"]).run(connection); + expect(result, equals(["foo", "bar", "buzz"])); + }); + test("should create a date one year from now", () async { + var result = await r.now().add(365 * 24 * 60 * 60).run(connection); + expect( + DateTime.now().add(Duration(days: 365)).difference(result).inMinutes, + lessThan(1)); + }); + test("should use args with add to sum multiple values", () async { + var vals = [10, 20, 30]; + var result = await r.add(r.args(vals)).run(connection); + expect(result, equals(60)); + }); + test("should use two args with add to sum multiple values", () async { + var vals1 = [10, 20, 30]; + var vals2 = [40, 50, 60]; + var result = + await r.add(r.args(vals1)).add(r.args(vals2)).run(connection); + expect(result, equals(210)); + }); + test("should use two args together with add to sum multiple values", + () async { + var vals1 = [10, 20, 30]; + var vals2 = [40, 50, 60]; + var result = await r.add(r.args(vals1), r.args(vals2)).run(connection); + expect(result, equals(210)); + }); + test("should concatenate an array of strings with args", () async { + var vals = ['foo', 'bar', 'buzz']; + var result = await r.add(r.args(vals)).run(connection); + expect(result, equals("foobarbuzz")); + }); + test("should concatenate two arrays of strings with args", () async { + var vals1 = ['foo', 'bar', 'buzz']; + var vals2 = ['foo1', 'bar1', 'buzz1']; + var result = + await r.add(r.args(vals1)).add(r.args(vals2)).run(connection); + expect(result, equals("foobarbuzzfoo1bar1buzz1")); + }); + test("should concatenate two arrays together of strings with args", + () async { + var vals1 = ['foo', 'bar', 'buzz']; + var vals2 = ['foo1', 'bar1', 'buzz1']; + var result = await r.add(r.args(vals1), r.args(vals2)).run(connection); + expect(result, equals("foobarbuzzfoo1bar1buzz1")); + }); + }); + + group("sub command -> ", () { + test("should sub two numbers", () async { + var result = await r.expr(4).sub(2).run(connection); + expect(result, equals(2)); + }); + test("should sub three numbers", () async { + 
var result = await r.expr(4).sub(2).sub(2).run(connection); + expect(result, equals(0)); + }); + test("should sub two numbers together", () async { + var result = await r.sub(4, 2).run(connection); + expect(result, equals(2)); + }); + test("should sub three numbers together", () async { + var result = await r.sub(4, 2, 2).run(connection); + expect(result, equals(0)); + }); + test("should create a date one year ago today", () async { + var result = await r.now().sub(365 * 24 * 60 * 60).run(connection); + expect( + DateTime.now() + .subtract(Duration(days: 365)) + .difference(result) + .inMinutes, + lessThan(1)); + }); + test("should retrieve how many seconds elapsed between today and date", + () async { + var date = DateTime(2018); + var result = await r.now().sub(date).run(connection); + expect( + DateTime.now() + .difference(DateTime(2018).add(Duration(minutes: result.round()))) + .inMinutes, + lessThan(1)); + }); + test("should use args with sub to subtract multiple values", () async { + var vals = [30, 20, 10]; + var result = await r.sub(r.args(vals)).run(connection); + expect(result, equals(0)); + }); + test("should use two args with sub to subtract multiple values", () async { + var vals1 = [60, 50, 40]; + var vals2 = [30, 20, 10]; + var result = + await r.sub(r.args(vals1)).sub(r.args(vals2)).run(connection); + expect(result, equals(-90)); + }); + test("should use two args together with sub to subtract multiple values", + () async { + var vals1 = [60, 50, 40]; + var vals2 = [30, 20, 10]; + var result = await r.sub(r.args(vals1), r.args(vals2)).run(connection); + expect(result, equals(-90)); + }); + }); + + group("mul command -> ", () { + test("should mul two numbers", () async { + var result = await r.expr(4).mul(2).run(connection); + expect(result, equals(8)); + }); + test("should mul three numbers", () async { + var result = await r.expr(4).mul(2).mul(2).run(connection); + expect(result, equals(16)); + }); + test("should mul two numbers together", () async { + var result = await r.mul(4, 2).run(connection); + expect(result, equals(8)); + }); + test("should mul three numbers together", () async { + var result = await r.mul(4, 2, 2).run(connection); + expect(result, equals(16)); + }); + test("should multiply an array by a number", () async { + var result = await r + .expr(["This", "is", "the", "song", "that", "never", "ends."]) + .mul(2) + .run(connection); + expect( + result, + equals([ + "This", + "is", + "the", + "song", + "that", + "never", + "ends.", + "This", + "is", + "the", + "song", + "that", + "never", + "ends." 
+ ])); + }); + test("should use args with mul to multiply multiple values", () async { + var vals = [10, 20, 30]; + var result = await r.mul(r.args(vals)).run(connection); + expect(result, equals(6000)); + }); + test("should use two args with mul to multiply multiple values", () async { + var vals1 = [10, 20, 30]; + var vals2 = [40, 50, 60]; + var result = + await r.mul(r.args(vals1)).mul(r.args(vals2)).run(connection); + expect(result, equals(720000000)); + }); + test("should use two args together with mul to multiply multiple values", + () async { + var vals1 = [10, 20, 30]; + var vals2 = [40, 50, 60]; + var result = await r.mul(r.args(vals1), r.args(vals2)).run(connection); + expect(result, equals(720000000)); + }); + }); + + group("div command -> ", () { + test("should div two numbers", () async { + var result = await r.expr(4).div(2).run(connection); + expect(result, equals(2)); + }); + test("should div three numbers", () async { + var result = await r.expr(4).div(2).div(2).run(connection); + expect(result, equals(1)); + }); + test("should div two numbers together", () async { + var result = await r.div(4, 2).run(connection); + expect(result, equals(2)); + }); + test("should div three numbers together", () async { + var result = await r.div(4, 2, 2).run(connection); + expect(result, equals(1)); + }); + test("should use args with div to divide by multiple values", () async { + var vals = [30, 2, 3]; + var result = await r.div(r.args(vals)).run(connection); + expect(result, equals(5)); + }); + test("should use two args with div to divide by multiple values", () async { + var vals1 = [900, 6, 3]; + var vals2 = [2, 5, 5]; + var result = + await r.div(r.args(vals1)).div(r.args(vals2)).run(connection); + expect(result, equals(1)); + }); + test("should use two args together with div to divide by multiple values", + () async { + var vals1 = [900, 6, 3]; + var vals2 = [2, 5, 5]; + var result = await r.div(r.args(vals1), r.args(vals2)).run(connection); + expect(result, equals(1)); + }); + }); + + group("mod command -> ", () { + test("should mod two numbers", () async { + var result = await r.expr(3).mod(2).run(connection); + expect(result, equals(1)); + }); + test("should mod three numbers", () async { + var result = await r.expr(9).mod(5).mod(5).run(connection); + expect(result, equals(4)); + }); + }); + + group("and command -> ", () { + test("should evaluate two false values with and", () async { + var a = false, b = false; + var result = await r.expr(a).and(b).run(connection); + expect(result, equals(false)); + }); + test("should evaluate one false and one true value with and", () async { + var a = true, b = false; + var result = await r.expr(a).and(b).run(connection); + expect(result, equals(false)); + }); + test("should evaluate two true values with and", () async { + var a = true, b = true; + var result = await r.expr(a).and(b).run(connection); + expect(result, equals(true)); + }); + test("should evaluate three false values together with and", () async { + var x = false, y = false, z = false; + var result = await r.and(x, y, z).run(connection); + expect(result, equals(false)); + }); + test("should evaluate one false and two true values together with and", + () async { + var x = false, y = true, z = true; + var result = await r.and(x, y, z).run(connection); + expect(result, equals(false)); + }); + test("should evaluate three true values together with and", () async { + var x = true, y = true, z = true; + var result = await r.and(x, y, z).run(connection); + expect(result, equals(true)); + 
}); + }); + + group("or command -> ", () { + test("should evaluate two false values with or", () async { + var a = false, b = false; + var result = await r.expr(a).or(b).run(connection); + expect(result, equals(false)); + }); + test("should evaluate one false and one true value with or", () async { + var a = true, b = false; + var result = await r.expr(a).or(b).run(connection); + expect(result, equals(true)); + }); + test("should evaluate two true values with or", () async { + var a = true, b = true; + var result = await r.expr(a).or(b).run(connection); + expect(result, equals(true)); + }); + test("should evaluate three false values together with or", () async { + var x = false, y = false, z = false; + var result = await r.or(x, y, z).run(connection); + expect(result, equals(false)); + }); + test("should evaluate one true and two false values together with or", + () async { + var x = true, y = false, z = false; + var result = await r.or(x, y, z).run(connection); + expect(result, equals(true)); + }); + test("should evaluate three true values together with or", () async { + var x = true, y = true, z = true; + var result = await r.or(x, y, z).run(connection); + expect(result, equals(true)); + }); + }); +} diff --git a/drivers/rethinkdb/test/parallel_execution_test.dart b/drivers/rethinkdb/test/parallel_execution_test.dart new file mode 100644 index 0000000..9e5f4f2 --- /dev/null +++ b/drivers/rethinkdb/test/parallel_execution_test.dart @@ -0,0 +1,79 @@ +import 'dart:async'; + +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +String? testDbName; + +main() { + setUp(() async { + final r = RethinkDb(); + final conn = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(conn); + testDbName = 'parralel_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(conn); + } + conn.close(); + }); + + tearDown(() async { + final r = RethinkDb(); + final conn = await r.connect(); + + await r.dbDrop(testDbName!).run(conn); + + conn.close(); + }); + + test('ParallelExecution', () async { + bool isParallel = await pEx(); + expect(isParallel, equals(true)); + }, timeout: Timeout.factor(4)); +} + +Future pEx() { + final r = RethinkDb(); + return r + .connect(db: testDbName!, port: 28015) + .then((connection) => _queryWhileWriting(connection, r)); +} + +Future _queryWhileWriting(conn, r) async { + //variable that will be set by our faster query + int? 
total; + + final testCompleter = Completer(); + + //set up some test tables + + try { + await r.tableCreate("emptyTable").run(conn); + await r.tableCreate("bigTable").run(conn); + } catch (err) { + //table exists + } + + //create a big array to write + var bigJson = []; + for (var i = 0; i < 10000; i++) { + bigJson.add({'id': i, 'name': 'a$i'}); + } + + r.table("bigTable").insert(bigJson).run(conn).then((d) { + //remove test tables after test complete + return r.tableDrop("bigTable").run(conn); + }).then((_) { + return r.tableDrop("emptyTable").run(conn); + }).then((_) { + conn.close(); + return testCompleter.complete(total != null); + }); + + //run another query while the insert is running + r.table("emptyTable").count().run(conn).then((t) { + total = t; + }); + + return testCompleter.future; +} diff --git a/drivers/rethinkdb/test/polymorphic_object_sequence_operations_test.dart b/drivers/rethinkdb/test/polymorphic_object_sequence_operations_test.dart new file mode 100644 index 0000000..a5c5fbd --- /dev/null +++ b/drivers/rethinkdb/test/polymorphic_object_sequence_operations_test.dart @@ -0,0 +1,314 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? connection; + + setUpTable() async { + return await r.table(tableName!).insert([ + { + 'id': 1, + 'name': 'Jane Doe', + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ] + }, + { + 'id': 2, + 'name': 'Jon Doe', + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'nickname': 'Jo' + }, + {'id': 3, 'name': 'Firstname Last'} + ]).run(connection!); + } + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + await setUpTable(); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("pluck command -> ", () { + test( + "should use pluck and return the children of the person with the id equal to 1", + () async { + var parent = + await r.table(tableName!).get(1).pluck('children').run(connection!); + expect(parent is Map, equals(true)); + + expect( + parent['children'], + equals([ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ])); + }); + test( + "should use pluck and return the children and the name of the person with the id equal to 1", + () async { + var parent = await r + .table(tableName!) 
+ .get(1) + .pluck('children', 'name') + .run(connection!); + expect(parent is Map, equals(true)); + + expect( + parent, + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'name': 'Jane Doe' + })); + }); + test("should use pluck and return the children of the people", () async { + Cursor parents = + await r.table(tableName!).pluck('children').run(connection!); + //expect(parents is Cursor, equals(true)); + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2]['children'], + equals([ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ])); + expect( + parentsList[1]['children'], + equals([ + {'id': 1, 'name': 'Louis'} + ])); + expect(parentsList[0]['children'], equals(null)); + }); + test("should use pluck and return the children and the name of the people", + () async { + Cursor parents = + await r.table(tableName!).pluck('children', 'name').run(connection!); + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2], + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'name': 'Jane Doe' + })); + expect( + parentsList[1], + equals({ + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'name': 'Jon Doe' + })); + expect(parentsList[0], equals({'name': 'Firstname Last'})); + }); + // TODO: add the nested objects test (without the shorthand). + test( + "should use pluck with the shorthand and return the children of the people who has child/children with id and name", + () async { + Cursor parents = await r.table(tableName!).pluck({ + 'children': ['id', 'name'] + }).run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2]['children'], + equals([ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ])); + expect( + parentsList[1]['children'], + equals([ + {'id': 1, 'name': 'Louis'} + ])); + expect(parentsList[0]['children'], equals(null)); + }); + test( + "should use pluck with the shorthand and return the children and the name of the people who has child/children with id and name", + () async { + Cursor parents = await r.table(tableName!).pluck({ + 'children': ['id', 'name'] + }, 'name').run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2], + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'name': 'Jane Doe' + })); + expect( + parentsList[1], + equals({ + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'name': 'Jon Doe' + })); + expect(parentsList[0], equals({'name': 'Firstname Last'})); + }); + // TODO: add tests with r.args. + }); + + group("without command -> ", () { + test( + "should use without and return the person with the id equal to 1 without the children data", + () async { + var parent = + await r.table(tableName!).get(1).without('children').run(connection!); + expect(parent is Map, equals(true)); + + expect(parent, equals({'id': 1, 'name': 'Jane Doe'})); + }); + test( + "should use without and return the person with the id equal to 1 without the children data and the name", + () async { + var parent = await r + .table(tableName!) 
+ .get(1) + .without('children', 'name') + .run(connection!); + expect(parent is Map, equals(true)); + + expect(parent, equals({'id': 1})); + }); + test("should use without and return the people without the children data", + () async { + Cursor parents = + await r.table(tableName!).without('children').run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect(parentsList[2], equals({'id': 1, 'name': 'Jane Doe'})); + expect(parentsList[1], + equals({'id': 2, 'name': 'Jon Doe', 'nickname': 'Jo'})); + expect(parentsList[0], equals({'id': 3, 'name': 'Firstname Last'})); + }); + test( + "should use without and return the people without the children data and the name", + () async { + Cursor parents = await r + .table(tableName!) + .without('children', 'name') + .run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect(parentsList[2], equals({'id': 1})); + expect(parentsList[1], equals({'id': 2, 'nickname': 'Jo'})); + expect(parentsList[0], equals({'id': 3})); + }); + // TODO: add the nested objects test (without the shorthand). + test( + "should use without with the shorthand and return the people who has child/children with id and name, but without them", + () async { + Cursor parents = await r.table(tableName!).without({ + 'children': ['id', 'name'] + }).run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2], + equals({ + 'children': [{}, {}], + 'id': 1, + 'name': 'Jane Doe' + })); + expect( + parentsList[1], + equals({ + 'children': [{}], + 'id': 2, + 'name': 'Jon Doe', + 'nickname': 'Jo' + })); + expect(parentsList[0], equals({'id': 3, 'name': 'Firstname Last'})); + }); + test( + "should use without with the shorthand and return the people who has child/children with id and name, but without them and the person name", + () async { + Cursor parents = await r.table(tableName!).without({ + 'children': ['id', 'name'] + }, 'name').run(connection!); + + List parentsList = await parents.toList(); + + expect(parentsList.length, equals(3)); + + expect( + parentsList[2], + equals({ + 'children': [{}, {}], + 'id': 1 + })); + expect( + parentsList[1], + equals({ + 'children': [{}], + 'id': 2, + 'nickname': 'Jo' + })); + expect(parentsList[0], equals({'id': 3})); + }); + // TODO: add tests with r.args. + }); + + // TODO: add rqlDo tests. + // TODO: add rqlDefault tests. + // TODO: add replace tests. + // TODO: add delete tests. +} diff --git a/drivers/rethinkdb/test/rql_type_inspection_test.dart b/drivers/rethinkdb/test/rql_type_inspection_test.dart new file mode 100644 index 0000000..2eeae56 --- /dev/null +++ b/drivers/rethinkdb/test/rql_type_inspection_test.dart @@ -0,0 +1,379 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUpTable() async { + return await r.table(tableName!).insert([ + { + 'id': 1, + 'name': 'Jane Doe', + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ] + }, + { + 'id': 2, + 'name': 'Jon Doe', + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'nickname': 'Jo' + }, + {'id': 3, 'name': 'Firstname Last'} + ]).run(connection!); + } + + setUp(() async { + connection = await r.connect(); + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + await setUpTable(); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + connection!.close(); + } else { + connection!.close(); + } + }); + + // TODO: add coerceTo tests. + // TODO: add ungroup tests. + // TODO: add typeOf tests. + + group("merge command -> ", () { + test("should merge person with id equal to 1 and person with id equal to 3", + () async { + var result = await r + .table(tableName!) + .get(1) + .merge(r.table(tableName!).get(3)) + .run(connection!); + expect(result is Map, equals(true)); + + expect( + result, + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 3, + 'name': 'Firstname Last' + })); + }); + test( + "should merge person with id equal to 2 and person with id equal to 1 and person with id equal to 3", + () async { + var result = await r + .table(tableName!) + .get(2) + .merge(r.table(tableName!).get(1)) + .merge(r.table(tableName!).get(3)) + .run(connection!); + expect(result is Map, equals(true)); + + expect( + result, + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 3, + 'name': 'Firstname Last', + 'nickname': 'Jo' + })); + }); + test( + "should merge person with id equal to 2 and person with id equal to 1 and person with id equal to 3 together", + () async { + var result = await r + .table(tableName!) + .get(2) + .merge(r.table(tableName!).get(1), r.table(tableName!).get(3)) + .run(connection!); + expect(result is Map, equals(true)); + + expect( + result, + equals({ + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 3, + 'name': 'Firstname Last', + 'nickname': 'Jo' + })); + }); + // TODO: add tests with r.args. + }); + + // TODO: add append tests. + // TODO: add floor tests. + // TODO: add ceil tests. + // TODO: add round tests. + // TODO: add prepend tests. + // TODO: add difference tests. + // TODO: add setInsert tests. + // TODO: add setUnion tests. + // TODO: add setIntersection tests. + // TODO: add setDifference tests. + // TODO: add getField tests. + // TODO: add nth tests. + // TODO: add match tests. + // TODO: add split tests. + // TODO: add upcase tests. + // TODO: add downcase tests. + // TODO: add isEmpty tests. + // TODO: add slice tests. + // TODO: add fold tests. + // TODO: add skip tests. + // TODO: add limit tests. + // TODO: add reduce tests. + // TODO: add sum tests. + // TODO: add avg tests. + // TODO: add min tests. + // TODO: add max tests. + // TODO: add map tests. + // TODO: add filter tests. + // TODO: add concatMap tests. + // TODO: add get tests. + // TODO: add orderBy tests. 
+ // TODO: add between tests. + // TODO: add distinct tests. + // TODO: add count tests. + + group("union command -> ", () { + test("should union people with one new person", () async { + var result = await r.table(tableName!).union({ + 'id': 4, + 'name': 'Additional 1', + 'hobbies': ["swim"] + }).run(connection!); + expect(result is Cursor, equals(true)); + List peopleList = await result.toList(); + + expect(peopleList.length, equals(4)); + + expect( + peopleList, + equals([ + { + 'hobbies': ['swim'], + 'id': 4, + 'name': 'Additional 1' + }, + {'id': 3, 'name': 'Firstname Last'}, + { + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'id': 2, + 'name': 'Jon Doe', + 'nickname': 'Jo' + }, + { + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 1, + 'name': 'Jane Doe' + } + ])); + }); + test("should union people with two new people", () async { + var result = await r.table(tableName!).union({ + 'id': 4, + 'name': 'Additional 1', + 'hobbies': ["swim"] + }).union({'id': 5, 'name': 'Additional 2', 'wife': 'Judith'}).run( + connection); + expect(result is Cursor, equals(true)); + List peopleList = await result.toList(); + + expect(peopleList.length, equals(5)); + + expect( + peopleList, + equals([ + {'id': 5, 'name': 'Additional 2', 'wife': 'Judith'}, + { + 'hobbies': ['swim'], + 'id': 4, + 'name': 'Additional 1' + }, + {'id': 3, 'name': 'Firstname Last'}, + { + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'id': 2, + 'name': 'Jon Doe', + 'nickname': 'Jo' + }, + { + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 1, + 'name': 'Jane Doe' + } + ])); + }); + test("should union people with two new people together", () async { + var result = await r.table(tableName!).union({ + 'id': 4, + 'name': 'Additional 1', + 'hobbies': ["swim"] + }, { + 'id': 5, + 'name': 'Additional 2', + 'wife': 'Judith' + }).run(connection!); + expect(result is Cursor, equals(true)); + List peopleList = await result.toList(); + + expect(peopleList.length, equals(5)); + + expect( + peopleList, + equals([ + { + 'hobbies': ['swim'], + 'id': 4, + 'name': 'Additional 1' + }, + {'id': 5, 'name': 'Additional 2', 'wife': 'Judith'}, + {'id': 3, 'name': 'Firstname Last'}, + { + 'children': [ + {'id': 1, 'name': 'Louis'} + ], + 'id': 2, + 'name': 'Jon Doe', + 'nickname': 'Jo' + }, + { + 'children': [ + {'id': 1, 'name': 'Robert'}, + {'id': 2, 'name': 'Mariah'} + ], + 'id': 1, + 'name': 'Jane Doe' + } + ])); + }); + // TODO: add tests with r.args. + // TODO: add tests with interleave. + }); + + // TODO: add innerJoin tests. + // TODO: add outerJoin tests. + // TODO: add eqJoin tests. + // TODO: add zip tests. 
+ + group("group command -> ", () { + var groupCommandData = [ + {'id': 2, 'player': "Bob", 'points': 15, 'type': "ranked"}, + {'id': 5, 'player': "Alice", 'points': 7, 'type': "free"}, + {'id': 11, 'player': "Bob", 'points': 10, 'type': "free"}, + {'id': 12, 'player': "Alice", 'points': 2, 'type': "free"} + ]; + test("should group by player", () async { + var result = + await r.expr(groupCommandData).group('player').run(connection!); + expect(result is Map, equals(true)); + + expect(result.length, equals(2)); + + expect( + result, + equals({ + 'Alice': [ + {'id': 5, 'player': 'Alice', 'points': 7, 'type': 'free'}, + {'id': 12, 'player': 'Alice', 'points': 2, 'type': 'free'} + ], + 'Bob': [ + {'id': 2, 'player': 'Bob', 'points': 15, 'type': 'ranked'}, + {'id': 11, 'player': 'Bob', 'points': 10, 'type': 'free'} + ] + })); + }); + test("should group by player and type together", () async { + var result = await r + .expr(groupCommandData) + .group('player', 'type') + .run(connection!); + expect(result is Map, equals(true)); + + expect(result.length, equals(3)); + + result.forEach((key, value) { + expect(key is List, equals(true)); + expect(key.length, equals(2)); + switch (key[0]) { + case 'Alice': + expect(key[1], 'free'); + expect( + value, + equals([ + {'id': 5, 'player': 'Alice', 'points': 7, 'type': 'free'}, + {'id': 12, 'player': 'Alice', 'points': 2, 'type': 'free'} + ])); + break; + case 'Bob': + switch (key[1]) { + case 'free': + expect( + value, + equals([ + {'id': 11, 'player': 'Bob', 'points': 10, 'type': 'free'} + ])); + break; + case 'ranked': + expect( + value, + equals([ + {'id': 2, 'player': 'Bob', 'points': 15, 'type': 'ranked'} + ])); + break; + default: + fail('invalid key'); + } + break; + default: + fail('invalid key'); + } + }); + }); + // TODO: add tests with r.args. + // TODO: add tests with index. + // TODO: add tests with multi. + }); + + // TODO: add forEach tests. + // TODO: add info tests. +} diff --git a/drivers/rethinkdb/test/selectingdata_test.dart b/drivers/rethinkdb/test/selectingdata_test.dart new file mode 100644 index 0000000..14e74ab --- /dev/null +++ b/drivers/rethinkdb/test/selectingdata_test.dart @@ -0,0 +1,182 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + var r = RethinkDb() as dynamic; + + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? 
connection; + + setUp(() async { + connection = await r.connect(); + + if (testDbName == null) { + String useDb = await r.uuid().run(connection); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName).run(connection); + } + connection!.use(testDbName!); + + if (tableName == null) { + String tblName = await r.uuid().run(connection); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName).run(connection); + } + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName).run(connection); + connection!.close(); + } else { + connection!.close(); + } + }); + + setUpTable() async { + return await r.table(tableName).insert([ + {'id': 1, 'name': 'Jane Doe'}, + {'id': 2, 'name': 'Jon Doe'}, + {'id': 3, 'name': 'Firstname Last'} + ]).run(connection); + } + + group("get command -> ", () { + test("should get a record by primary key", () async { + await setUpTable(); + var usr = await r.table(tableName).get(1).run(connection); + + expect(usr['id'], equals(1)); + expect(usr['name'], equals('Jane Doe')); + }); + }); + + group("getAll command -> ", () { + test("should get records by primary keys", () async { + Cursor usrs = await r.table(tableName).getAll(1, 3).run(connection); + + List userList = await usrs.toList(); + + expect(userList[1]['id'], equals(1)); + expect(userList[0]['id'], equals(3)); + + expect(userList[1]['name'], equals('Jane Doe')); + expect(userList[0]['name'], equals('Firstname Last')); + }); + }); + + group("between command -> ", () { + test("should get records between keys defaulting to closed left bound", + () async { + Cursor usrs = await r.table(tableName).between(1, 3).run(connection); + + List userList = await usrs.toList(); + + expect(userList.length, equals(2)); + expect(userList[1]['id'], equals(1)); + expect(userList[0]['id'], equals(2)); + + expect(userList[1]['name'], equals('Jane Doe')); + expect(userList[0]['name'], equals('Jon Doe')); + }); + + test("should get records between keys with closed right bound", () async { + Cursor usrs = await r + .table(tableName) + .between(1, 3, {'right_bound': 'closed'}).run(connection); + + List userList = await usrs.toList(); + + expect(userList.length, equals(3)); + expect(userList[2]['id'], equals(1)); + expect(userList[0]['id'], equals(3)); + + expect(userList[2]['name'], equals('Jane Doe')); + expect(userList[0]['name'], equals('Firstname Last')); + }); + + test("should get records between keys with open left bound", () async { + Cursor usrs = await r + .table(tableName) + .between(1, 3, {'left_bound': 'open'}).run(connection); + List userList = await usrs.toList(); + + expect(userList.length, equals(1)); + expect(userList[0]['id'], equals(2)); + + expect(userList[0]['name'], equals('Jon Doe')); + }); + + test("should get records between minval and a given key", () async { + Cursor usrs = + await r.table(tableName).between(r.minval, 2).run(connection); + + List userList = await usrs.toList(); + + expect(userList.length, equals(1)); + expect(userList[0]['id'], equals(1)); + + expect(userList[0]['name'], equals('Jane Doe')); + }); + + test("should get records between a given key and maxval", () async { + Cursor usrs = + await r.table(tableName).between(2, r.maxval).run(connection); + + List userList = await usrs.toList(); + + expect(userList.length, equals(2)); + expect(userList[0]['id'], equals(3)); + + expect(userList[0]['name'], equals('Firstname Last')); + }); + }); + + group("filter command -> ", () { + 
test("should filter by field", () async { + Cursor users = + await r.table(tableName).filter({'name': 'Jane Doe'}).run(connection); + + List userList = await users.toList(); + + expect(userList.length, equals(1)); + expect(userList[0]['id'], equals(1)); + expect(userList[0]['name'], equals('Jane Doe')); + }); + + test("should filter with r.row", () async { + Cursor users = await r + .table(tableName) + .filter(r.row('name').match("Doe")) + .run(connection); + + List userList = await users.toList(); + + expect(userList.length, equals(2)); + expect(userList[0]['id'], equals(2)); + expect(userList[0]['name'], equals('Jon Doe')); + }); + + test("should filter with a function", () async { + Cursor users = await r.table(tableName).filter((user) { + return user('name').eq("Jon Doe").or(user('name').eq("Firstname Last")); + }).run(connection); + + List userList = await users.toList(); + + expect(userList.length, equals(2)); + expect(userList[0]['id'], equals(3)); + expect(userList[0]['name'], equals('Firstname Last')); + }); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName).run(connection); + + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + }); +} diff --git a/drivers/rethinkdb/test/tablemethods_test.dart b/drivers/rethinkdb/test/tablemethods_test.dart new file mode 100644 index 0000000..973ec1b --- /dev/null +++ b/drivers/rethinkdb/test/tablemethods_test.dart @@ -0,0 +1,275 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + var r = RethinkDb() as dynamic; + String? databaseName; + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? connection; + + setUp(() async { + connection = await r.connect(); + + if (testDbName == null) { + String useDb = await r.uuid().run(connection); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName).run(connection); + } + + if (databaseName == null) { + String dbName = await r.uuid().run(connection); + databaseName = "test_database_${dbName.replaceAll("-", "")}"; + } + + if (tableName == null) { + String tblName = await r.uuid().run(connection); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + } + connection!.use(testDbName!); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName).run(connection); + connection!.close(); + } else { + connection!.close(); + } + }); + + group("tableCreate command -> ", () { + test("should create a table given a name", () async { + Map createdTable = await r.tableCreate('unitTestTable').run(connection); + + expect(createdTable['config_changes'] is List, equals(true)); + expect(createdTable['tables_created'], equals(1)); + Map newTable = createdTable['config_changes'][0]['new_val']; + expect(newTable['name'], equals('unitTestTable')); + expect(createdTable['config_changes'][0]['old_val'], equals(null)); + }); + test( + "should throw an `ReqlOpFailedError` if a table with the same name exists", + () async { + try { + await r.tableCreate('unitTestTable').run(connection); + } catch (err) { + expect(err.runtimeType, equals(ReqlOpFailedError)); + } + }); + test("should allow user to specify primary_key", () async { + Map createdTable = await r.tableCreate( + 'unitTestTable1', {'primary_key': 'userID'}).run(connection); + + expect(createdTable['config_changes'] is List, equals(true)); + 
expect(createdTable['tables_created'], equals(1)); + Map newTable = createdTable['config_changes'][0]['new_val']; + expect(newTable['name'], equals('unitTestTable1')); + expect(createdTable['config_changes'][0]['old_val'], equals(null)); + expect(newTable['primary_key'], equals('userID')); + }); + + test("should allow user to specify durability", () async { + Map createdTable = await r.tableCreate( + 'unitTestTable2', {'durability': 'soft'}).run(connection); + + expect(createdTable['config_changes'] is List, equals(true)); + expect(createdTable['tables_created'], equals(1)); + Map newTable = createdTable['config_changes'][0]['new_val']; + expect(newTable['name'], equals('unitTestTable2')); + expect(createdTable['config_changes'][0]['old_val'], equals(null)); + expect(newTable['durability'], equals('soft')); + }); + test("should allow user to specify number of shards", () async { + Map createdTable = + await r.tableCreate('unitTestTable3', {'shards': 64}).run(connection); + + expect(createdTable['config_changes'] is List, equals(true)); + expect(createdTable['tables_created'], equals(1)); + Map newTable = createdTable['config_changes'][0]['new_val']; + expect(newTable['name'], equals('unitTestTable3')); + expect(createdTable['config_changes'][0]['old_val'], equals(null)); + expect(newTable['shards'].length, equals(64)); + }); + + test("should allow user to specify replicas", () async { + Map createdTable = + await r.tableCreate('unitTestTable4', {'shards': 3}).run(connection); + + expect(createdTable['config_changes'] is List, equals(true)); + expect(createdTable['tables_created'], equals(1)); + Map newTable = createdTable['config_changes'][0]['new_val']; + expect(newTable['name'], equals('unitTestTable4')); + expect(createdTable['config_changes'][0]['old_val'], equals(null)); + expect(newTable['shards'].length, equals(3)); + }); + }); + + group("tableDrop command -> ", () { + test("should drop an existing table", () async { + Map droppedTable = await r.tableDrop('unitTestTable4').run(connection); + + expect(droppedTable['tables_dropped'], equals(1)); + Map oldTable = droppedTable['config_changes'][0]['old_val']; + expect(oldTable['name'], equals('unitTestTable4')); + }); + + test("should throw an `ReqlOpFailedError` if the table doesn't exist", + () async { + try { + await r.tableDrop('unitTestTable4').run(connection); + } catch (err) { + expect(err is ReqlOpFailedError, equals(true)); + } + }); + }); + + test("tableList command -> should list all tables for the database", + () async { + List tables = await r.tableList().run(connection); + + expect(tables.length, equals(4)); + }); + + group("indexCreate command -> ", () { + test("should create a new secondary index", () async { + Map ind = + await r.table('unitTestTable3').indexCreate('index1').run(connection); + expect(ind['created'], equals(1)); + }); + + test("should create a new geo index", () async { + Map ind = await r + .table('unitTestTable3') + .indexCreate('location', {'geo': true}).run(connection); + + expect(ind['created'], equals(1)); + }); + + test("should create a new index with an index function", () async { + Map ind = await r + .table('unitTestTable3') + .indexCreate('surname', (name) => name('last')) + .run(connection); + + expect(ind['created'], equals(1)); + }); + + test("should create a new index with an index function as `r.row`", + () async { + Map ind = await r + .table('unitTestTable3') + .indexCreate('surname1', r.row('name')('last')) + .run(connection); + + expect(ind['created'], equals(1)); + }); + + 
test("should create a new compound index with an index", () async { + Map ind = await r.table('unitTestTable3').indexCreate( + 'surnameAndBirthYear', + [r.row('name')('last'), r.row('birthday')('year')]).run(connection); + + expect(ind['created'], equals(1)); + }); + + test("should create multi index", () async { + Map ind = await r + .table('unitTestTable3') + .indexCreate('author', {'multi': true}).run(connection); + + expect(ind['created'], equals(1)); + }); + }); + + test("indexDrop command -> remove an index", () async { + Map ind = + await r.table('unitTestTable3').indexDrop('author').run(connection); + expect(ind['dropped'], equals(1)); + }); + + test("indexList command -> should list all indexes for a table", () async { + List indexes = await r.table('unitTestTable3').indexList().run(connection); + + expect(indexes.length, equals(5)); + }); + + group("indexRename command -> ", () { + test("should rename an index", () async { + Map renamedIndex = await r + .table('unitTestTable3') + .indexRename('surname', 'lastName') + .run(connection); + expect(renamedIndex['renamed'], equals(1)); + }); + + test("should overwrite existing index if overwrite = true", () async { + Map renamedIndex = await r.table('unitTestTable3').indexRename( + 'lastName', 'location', {'overwrite': true}).run(connection); + + expect(renamedIndex['renamed'], equals(1)); + }); + }); + + group("indexStatus command -> ", () { + test("should return status of all indexes", () async { + List indexes = + await r.table('unitTestTable3').indexStatus().run(connection); + + expect(indexes.length, equals(4)); + }); + + test("should return status of a single index", () async { + List indexes = + await r.table('unitTestTable3').indexStatus('index1').run(connection); + + expect(indexes.length, equals(1)); + expect(indexes[0]['index'], equals('index1')); + }); + + test("should return any number of indexes", () async { + List indexes = await r + .table('unitTestTable3') + .indexStatus('index1', 'location') + .run(connection); + + expect(indexes.length, equals(2)); + expect(indexes[0]['index'], equals('index1')); + }); + }); + + group("indexWait command -> ", () { + test("should wait for all indexes if none are specified", () async { + List response = + await r.table('unitTestTable3').indexWait().run(connection); + + expect(response.length, equals(4)); + }); + + test("should wait for a specified index", () async { + List response = + await r.table('unitTestTable3').indexWait('index1').run(connection); + + expect(response.length, equals(1)); + expect(response[0]['index'], equals('index1')); + }); + + test("should wait for any number of indexes", () async { + List indexes = await r + .table('unitTestTable3') + .indexWait('index1', 'location') + .run(connection); + + expect(indexes.length, equals(2)); + expect(indexes[0]['index'], equals('index1')); + }); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName).run(connection); + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + }); +} diff --git a/drivers/rethinkdb/test/toplevelqueries_test.dart b/drivers/rethinkdb/test/toplevelqueries_test.dart new file mode 100644 index 0000000..b13ba06 --- /dev/null +++ b/drivers/rethinkdb/test/toplevelqueries_test.dart @@ -0,0 +1,682 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + var r = RethinkDb() as dynamic; + String? databaseName; + String? tableName; + String? 
testDbName; + bool shouldDropTable = false; + Connection? connection; + + setUp(() async { + connection = await r.connect(); + + if (testDbName == null) { + String useDb = await r.uuid().run(connection); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName).run(connection); + } + + if (databaseName == null) { + String dbName = await r.uuid().run(connection); + databaseName = "test_database_${dbName.replaceAll("-", "")}"; + } + + if (tableName == null) { + String tblName = await r.uuid().run(connection); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + } + connection!.use(testDbName!); + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName).run(connection); + } + connection!.close(); + }); + + test("r.db throws an error if a bad database name is given", () async { + try { + await r.db('fake2834723895').tableList().run(connection); + } catch (err) { + expect(err is Exception, equals(true)); + expect(err.toString().split("\n")[2], + equals('Database `fake2834723895` does not exist.')); + } + }); + + group("dbCreate command -> ", () { + test("r.dbCreate will create a new database", () async { + Map response = await r.dbCreate(databaseName).run(connection); + + expect(response.keys.length, equals(2)); + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_created'], equals(1)); + + Map configChanges = response['config_changes'][0]; + expect(configChanges.keys.length, equals(2)); + expect(configChanges['old_val'], equals(null)); + Map newVal = configChanges['new_val']; + expect(newVal.containsKey('id'), equals(true)); + expect(newVal.containsKey('name'), equals(true)); + expect(newVal['name'], equals(databaseName)); + }); + + test("r.dbCreate will throw an error if the database exists", () async { + try { + await r.dbCreate(databaseName).run(connection); + } catch (err) { + expect(err is Exception, equals(true)); + expect( + err.toString().split("\n")[2], + // ignore: unnecessary_brace_in_string_interps + equals('Database `${databaseName}` already exists.')); + } + }); + }); + + group("dbDrop command -> ", () { + test("r.dbDrop should drop a database", () async { + Map response = await r.dbDrop(databaseName).run(connection); + + expect(response.keys.length, equals(3)); + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + expect(response['tables_dropped'], equals(0)); + + Map configChanges = response['config_changes'][0]; + expect(configChanges.keys.length, equals(2)); + expect(configChanges['new_val'], equals(null)); + Map oldVal = configChanges['old_val']; + expect(oldVal.containsKey('id'), equals(true)); + expect(oldVal.containsKey('name'), equals(true)); + expect(oldVal['name'], equals(databaseName)); + }); + + test("r.dbDrop should error if the database does not exist", () async { + try { + await r.dbDrop(databaseName).run(connection); + } catch (err) { + expect(err.toString().split("\n")[2], + equals('Database `$databaseName` does not exist.')); + } + }); + }); + + test("r.dbList should list all databases", () async { + List response = await r.dbList().run(connection); + + expect(response.indexOf('rethinkdb'), greaterThan(-1)); + }); + + group("range command -> ", () { + test("r.range() with no arguments should return a stream", () async { + Cursor cur = await r.range().run(connection); + + List item = await cur.take(17).toList(); + expect(item, + equals([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 
13, 14, 15, 16])); + }); + + test("r.range() should accept a single end argument", () async { + Cursor cur = await r.range(10).run(connection); + + List l = await cur.toList(); + expect(l, equals([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])); + }); + + test("r.range() should accept a start and end argument", () async { + Cursor cur = await r.range(7, 10).run(connection); + List l = await cur.toList(); + expect(l, equals([7, 8, 9])); + }); + }); + + group("table command -> ", () { + test("table should return a cursor containing all records for a table", + () async { + Cursor cur = await r.db('rethinkdb').table('stats').run(connection); + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should allow for `read_mode: single` option", () async { + Cursor cur = await r + .db('rethinkdb') + .table('stats', {'read_mode': 'single'}).run(connection); + + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should allow for `read_mode: majority` option", () async { + Cursor cur = await r + .db('rethinkdb') + .table('stats', {'read_mode': 'majority'}).run(connection); + + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should allow for `read_mode: outdated` option", () async { + Cursor cur = await r + .db('rethinkdb') + .table('stats', {'read_mode': 'outdated'}).run(connection); + + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should catch invalid read_mode option", () async { + try { + await r + .db('rethinkdb') + .table('stats', {'read_mode': 'badReadMode'}).run(connection); + } catch (err) { + expect( + err.toString().split("\n")[2], + equals( + 'Read mode `badReadMode` unrecognized (options are "majority", "single", and "outdated").')); + } + }); + + test("table should allow for `identifier_format: name` option", () async { + Cursor cur = await r + .db('rethinkdb') + .table('stats', {'identifier_format': 'name'}).run(connection); + + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should allow for `identifier_format: uuid` option", () async { + Cursor cur = await r + .db('rethinkdb') + .table('stats', {'identifier_format': 'uuid'}).run(connection); + + await for (Map item in cur) { + expect(item.containsKey('id'), equals(true)); + expect(item.containsKey('query_engine'), equals(true)); + } + }); + + test("table should catch invalid identifier_format option", () async { + try { + await r + .db('rethinkdb') + .table('stats', {'identifier_format': 'badFormat'}).run(connection); + } catch (err) { + expect( + err.toString().split("\n")[2], + equals( + 'Identifier format `badFormat` unrecognized (options are "name" and "uuid").')); + } + }); + + test("table should catch bad options", () async { + try { + await r + .db('rethinkdb') + .table('stats', {'fake_option': 'bad_value'}).run(connection); + } catch (err) { + expect(err.toString().split("\n")[2], + equals('Unrecognized optional argument `fake_option`.')); + } + }); + }); + + group("time command -> ", () { + test( + "should return a time object if given a year, month, day, and timezone", + () async { + 
DateTime obj = await r.time(2010, 12, 29, timezone: 'Z').run(connection); + + expect(obj.runtimeType, equals(DateTime)); + expect(obj.isBefore(DateTime.now()), equals(true)); + expect(obj.minute, equals(0)); + expect(obj.second, equals(0)); + }); + + test( + "should return a time object if given a year, month, day, hour, minute, second, and timezone", + () async { + DateTime obj = await r + .time(2010, 12, 29, hour: 7, minute: 33, second: 45, timezone: 'Z') + .run(connection); + + expect(obj.runtimeType, equals(DateTime)); + expect(obj.isBefore(DateTime.now()), equals(true)); + expect(obj.minute, equals(33)); + expect(obj.second, equals(45)); + }); + }); + + test( + "nativeTime command -> should turn a native dart DateTime to a reql time", + () async { + DateTime dt = DateTime.now(); + DateTime rqlDt = await r.nativeTime(dt).run(connection); + + expect(dt.year, equals(rqlDt.year)); + expect(dt.month, equals(rqlDt.month)); + expect(dt.day, equals(rqlDt.day)); + expect(dt.hour, equals(rqlDt.hour)); + expect(dt.minute, equals(rqlDt.minute)); + expect(dt.second, equals(rqlDt.second)); + }); + + group("ISO8601 command -> ", () { + test("should take an ISO8601 string and convert it to a DateTime object", + () async { + DateTime dt = + await r.ISO8601('1986-11-03T08:30:00-07:00').run(connection); + + expect(dt.year, equals(1986)); + expect(dt.month, equals(11)); + expect(dt.day, equals(3)); + expect(dt.minute, equals(30)); + }); + + test("should accept a timezone argument as well", () async { + DateTime dt = + await r.ISO8601('1986-11-03T08:30:00-07:00', 'MST').run(connection); + + expect(dt.year, equals(1986)); + expect(dt.month, equals(11)); + expect(dt.day, equals(3)); + expect(dt.minute, equals(30)); + }); + }); + + test("epochTime command -> should take a timestamp and return a time object", + () async { + DateTime dateTime = DateTime.fromMillisecondsSinceEpoch(531360000000); + + DateTime dt = await r.epochTime(531360000).run(connection); + + expect(dt.month, equals(dateTime.month)); + expect(dt.day, equals(dateTime.day)); + expect(dt.hour, equals(dateTime.hour)); + expect(dt.minute, equals(dateTime.minute)); + expect(dt.second, equals(dateTime.second)); + }); + + test("now command -> should return current DateTime object", () async { + DateTime dt = await r.now().run(connection); + + await Future.delayed(Duration(milliseconds: 1)); + + //expect(dt is DateTime, equals(true)); + expect(DateTime.now().difference(dt).inSeconds == 0, equals(true)); + }); + + group("rqlDo command -> ", () { + test("should accept a single argument and function", () async { + bool i = await r.rqlDo(3, (item) => item > 4).run(connection); + + expect(i, equals(false)); + }); + + test("should accept a many arguments and a function", () async { + num i = await r + .rqlDo( + 3, + 4, + 5, + 6, + 7, + (item1, item2, item3, item4, item5) => + item1 + item2 + item3 + item4 + item5) + .run(connection); + + expect(i, equals(25)); + }); + + test("should accept many args and an expression", () async { + Cursor cur = await r.rqlDo(3, 7, r.range).run(connection); + + List list = await cur.toList(); + expect(list, equals([3, 4, 5, 6])); + }); + }); + + group("branch command -> ", () { + test("should accept a true test and return the true branch value", + () async { + String val = await r.branch(3 < 4, 'isTrue', 'isFalse').run(connection); + + expect(val, equals('isTrue')); + }); + + test("should accept a false test and return the false branch value", + () async { + String val = await r.branch(3 > 4, 'isTrue', 
'isFalse').run(connection); + + expect(val, equals('isFalse')); + }); + + test("should accept multiple tests and actions", () async { + String val = await r + .branch(1 > 4, 'isTrue', 0 < 1, 'elseTrue', 'isFalse') + .run(connection); + + expect(val, equals('elseTrue')); + }); + }); + + test("error command -> should create a custom error", () async { + try { + await r.error('This is my Error').run(connection); + } catch (err) { + expect(err.runtimeType, equals(ReqlUserError)); + expect(err.toString().split("\n")[2], equals('This is my Error')); + } + }); + + group("js command -> ", () { + test("should run custom javascript", () async { + String jsString = """ + function concatStrs(){ + return 'firstHalf' + '_' + 'secondHalf'; + } + concatStrs(); + """; + + String str = await r.js(jsString).run(connection); + + expect(str, equals('firstHalf_secondHalf')); + }); + + //TODO: fix test + + // test("should accept a timeout option", () async { + // String jsString = """ + // function concatStrs(){ + // return 'firstHalf' + '_' + 'secondHalf'; + // } + // while(true){ + // concatStrs(); + // } + // """; + // int timeout = 3; + // try { + // await r.js(jsString, {'timeout': timeout}).run(connection); + // } catch (err) { + // expect( + // err.toString(), + // equals( + // 'JavaScript query `$jsString` timed out after $timeout.000 seconds.')); + // } + // }); + }); + + group("json command -> ", () { + test("should parse a json string", () async { + String jsonString = "[1,2,3,4]"; + List obj = await r.json(jsonString).run(connection); + + expect([1, 2, 3, 4], equals(obj)); + }); + + test("should throw error if jsonString is invalid", () async { + String jsonString = "1,2,3,4]"; + try { + await r.json(jsonString).run(connection); + } catch (err) { + expect( + err.toString().split("\n")[2], + equals( + 'Failed to parse "$jsonString" as JSON: The document root must not follow by other values.')); + } + }); + }); + + group("object command -> ", () { + test("should create an object from an array of values", () async { + Map obj = await r + .object('key', 'val', 'listKey', [1, 2, 3, 4], 'objKey', {'a': 'b'}) + .run(connection); + + //expect(obj is Map, equals(true)); + expect(obj['key'], equals('val')); + expect(obj['listKey'], equals([1, 2, 3, 4])); + expect(obj['objKey']['a'], equals('b')); + }); + + test("should throw an error if params cannot be parsed into a map", + () async { + try { + await r + .object('key', 'val', 'listKey', [1, 2, 3, 4], 'objKey', {'a': 'b'}, + 'odd') + .run(connection); + } catch (err) { + expect( + err.toString().split("\n")[2], + equals( + 'OBJECT expects an even number of arguments (but found 7).')); + } + }); + }); + + test("args command -> should accept an array", () async { + List l = await r.args([1, 2]).run(connection); + + expect(l, equals([1, 2])); + }); + + group("random command -> ", () { + test("should generate a random number if no parameters are provided", + () async { + double number = await r.random().run(connection); + + expect(number, lessThanOrEqualTo(1)); + expect(number, greaterThanOrEqualTo(0)); + }); + + test( + "should generate a positive random int no greater than the single argument", + () async { + int number = await r.random(50).run(connection); + + expect(number, lessThanOrEqualTo(50)); + expect(number, greaterThanOrEqualTo(0)); + }); + + test("should generate a random int between the two arguments", () async { + int number = await r.random(50, 55).run(connection); + + expect(number, lessThanOrEqualTo(55)); + expect(number, 
greaterThanOrEqualTo(50)); + }); + + test("should generate a random float between the two arguments", () async { + double number = await r.random(50, 55, {'float': true}).run(connection); + expect(number, lessThanOrEqualTo(55)); + expect(number, greaterThanOrEqualTo(50)); + }); + }); + + group("not command -> ", () { + test("should return false if given no arguments", () async { + bool val = await r.not().run(connection); + + expect(val, equals(false)); + }); + + test("should return the inverse of the argument provided", () async { + bool val = await r.not(false).run(connection); + + expect(val, equals(true)); + }); + }); + + group("map command -> ", () { + test("should map over an array", () async { + List arr = + await r.map([1, 2, 3, 4, 5], (item) => item * 2).run(connection); + expect(arr, equals([2, 4, 6, 8, 10])); + }); + + test("should map over multiple arrays", () async { + List arr = await r.map([1, 2, 3, 4, 5], [10, 9, 8, 7], + (item, item2) => item + item2).run(connection); + + //notice that the first array is longer but we + //only map the length of the shortest array + expect(arr, equals([11, 11, 11, 11])); + }); + + test("should map a sequence", () async { + List arr = await r + .map( + r.expr({ + 'key': [1, 2, 3, 4, 5] + }).getField('key'), + (item) => item + 1) + .run(connection); + + expect(arr, equals([2, 3, 4, 5, 6])); + }); + + test("should map over multiple sequences", () async { + List arr = await r + .map( + r.expr({ + 'key': [1, 2, 3, 4, 5] + }).getField('key'), + r.expr({ + 'key': [1, 2, 3, 4, 5] + }).getField('key'), + (item, item2) => item + item2) + .run(connection); + + expect(arr, equals([2, 4, 6, 8, 10])); + }); + }); + + group("and command -> ", () { + test("should and two values together", () async { + bool val = await r.and(true, true).run(connection); + + expect(val, equals(true)); + }); + test("should and more than two values together", () async { + bool val = await r.and(true, true, false).run(connection); + + expect(val, equals(false)); + }); + }); + + group("or command -> ", () { + test("should or two values together", () async { + bool val = await r.or(true, false).run(connection); + + expect(val, equals(true)); + }); + + test("should or more than two values together", () async { + bool val = await r.or(false, false, false).run(connection); + + expect(val, equals(false)); + }); + }); + + group("binary command -> ", () { + test("should convert string to binary", () async { + List data = await r.binary('billysometimes').run(connection); + + expect( + data, + equals([ + 98, + 105, + 108, + 108, + 121, + 115, + 111, + 109, + 101, + 116, + 105, + 109, + 101, + 115 + ])); + }); + }); + + group("uuid command -> ", () { + test("should create a unique uuid", () async { + String val = await r.uuid().run(connection); + + expect(val, isNotNull); + }); + + test("should create a uuid based on a string key", () async { + String key = "billysometimes"; + String val = await r.uuid(key).run(connection); + + expect(val, equals('b3f5029e-f777-572f-a85d-5529b74fd99b')); + }); + }); + + group("expr command -> ", () { + test("expr should convert native string to rql string", () async { + String str = await r.expr('string').run(connection); + expect(str, equals('string')); + }); + + test("expr should convert native int to rql int", () async { + int str = await r.expr(3).run(connection); + expect(str, equals(3)); + }); + test("expr should convert native double to rql float", () async { + double str = await r.expr(3.14).run(connection); + expect(str, equals(3.14)); + }); + 
test("expr should convert native bool to rql bool", () async { + bool str = await r.expr(true).run(connection); + expect(str, equals(true)); + }); + test("expr should convert native list to rql array", () async { + List str = await r.expr([1, 2, 3]).run(connection); + expect(str, equals([1, 2, 3])); + }); + test("expr should convert native object to rql object", () async { + Map str = await r.expr({'a': 'b'}).run(connection); + expect(str, equals({'a': 'b'})); + }); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName).run(connection); + + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + expect(response['tables_dropped'], equals(0)); + }); + + /// TO TEST: + /// test with orderby: r.asc(attr) + /// test with orderby: r.desc(attr) + /// r.http(url) + /// + /// test with filter or something: r.row; + /// test with time: r.monday ... r.sunday; + /// test with time: r.january .. r.december; +} diff --git a/drivers/rethinkdb/test/writingdata_test.dart b/drivers/rethinkdb/test/writingdata_test.dart new file mode 100644 index 0000000..3913157 --- /dev/null +++ b/drivers/rethinkdb/test/writingdata_test.dart @@ -0,0 +1,318 @@ +import 'package:platform_driver_rethinkdb/platform_driver_rethinkdb.dart'; +import 'package:test/test.dart'; + +main() { + RethinkDb r = RethinkDb(); + String? tableName; + String? testDbName; + bool shouldDropTable = false; + Connection? connection; + + setUp(() async { + connection = await r.connect(); + + if (testDbName == null) { + String useDb = await r.uuid().run(connection!); + testDbName = 'unit_test_db${useDb.replaceAll("-", "")}'; + await r.dbCreate(testDbName!).run(connection!); + } + connection!.use(testDbName!); + + if (tableName == null) { + String tblName = await r.uuid().run(connection!); + tableName = "test_table_${tblName.replaceAll("-", "")}"; + await r.tableCreate(tableName!).run(connection!); + } + }); + + tearDown(() async { + if (shouldDropTable) { + shouldDropTable = false; + await r.tableDrop(tableName!).run(connection!); + } + connection!.close(); + }); + + group("insert command -> ", () { + test("should insert a single record", () async { + Map createdRecord = await r + .table(tableName!) + .insert({'id': 1, 'name': 'Jane Doe'}).run(connection!); + + expect(createdRecord['deleted'], equals(0)); + expect(createdRecord['errors'], equals(0)); + expect(createdRecord['inserted'], equals(1)); + expect(createdRecord['replaced'], equals(0)); + expect(createdRecord['skipped'], equals(0)); + expect(createdRecord['unchanged'], equals(0)); + }); + + test("should error if record exists", () async { + Map createdRecord = await r + .table(tableName!) 
+ .insert({'id': 1, 'name': 'Jane Doe'}).run(connection!); + + expect( + createdRecord['first_error'].startsWith('Duplicate primary key `id`'), + equals(true)); + expect(createdRecord['deleted'], equals(0)); + expect(createdRecord['errors'], equals(1)); + expect(createdRecord['inserted'], equals(0)); + expect(createdRecord['replaced'], equals(0)); + expect(createdRecord['skipped'], equals(0)); + expect(createdRecord['unchanged'], equals(0)); + }); + test("should insert multiple records", () async { + Map createdRecord = await r.table(tableName!).insert([ + {'name': 'Jane Doe'}, + { + 'id': 2, + 'name': 'John Bonham', + 'kit': { + 'cymbals': ['hi-hat', 'crash', 'splash'], + 'drums': ['kick', 'tom'] + } + } + ]).run(connection!); + + expect(createdRecord['deleted'], equals(0)); + expect(createdRecord['errors'], equals(0)); + expect(createdRecord['generated_keys'].length, equals(1)); + expect(createdRecord['inserted'], equals(2)); + expect(createdRecord['replaced'], equals(0)); + expect(createdRecord['skipped'], equals(0)); + expect(createdRecord['unchanged'], equals(0)); + }); + test("should change durability", () async { + Map createdRecord = await r + .table(tableName!) + .insert({'name': 'a'}, {'durability': 'hard'}).run(connection!); + + expect(createdRecord['deleted'], equals(0)); + expect(createdRecord['errors'], equals(0)); + expect(createdRecord['generated_keys'].length, equals(1)); + expect(createdRecord['inserted'], equals(1)); + expect(createdRecord['replaced'], equals(0)); + expect(createdRecord['skipped'], equals(0)); + expect(createdRecord['unchanged'], equals(0)); + }); + test("should allow return_changes", () async { + Map createdRecord = await r + .table(tableName!) + .insert({'name': 'a'}, {'return_changes': true}).run(connection!); + + expect(createdRecord['changes'][0]['new_val']['name'], equals('a')); + expect(createdRecord['changes'][0]['old_val'], equals(null)); + expect(createdRecord['deleted'], equals(0)); + expect(createdRecord['errors'], equals(0)); + expect(createdRecord['generated_keys'].length, equals(1)); + expect(createdRecord['inserted'], equals(1)); + expect(createdRecord['replaced'], equals(0)); + expect(createdRecord['skipped'], equals(0)); + expect(createdRecord['unchanged'], equals(0)); + }); + + test("should allow handle conflicts", () async { + var update = await r.table(tableName!).insert( + {'id': 1, 'birthYear': DateTime(1984)}, + {'conflict': 'update', 'return_changes': true}).run(connection!); + + expect(update['changes'][0]['new_val'].containsKey('birthYear'), + equals(true)); + expect(update['changes'][0]['new_val'].containsKey('name'), equals(true)); + expect(update['changes'][0]['old_val'].containsKey('birthYear'), + equals(false)); + expect(update['changes'][0]['old_val'].containsKey('name'), equals(true)); + expect(update['replaced'], equals(1)); + + var replace = await r.table(tableName!).insert( + {'id': 1, 'name': 'Jane Doe'}, + {'conflict': 'replace', 'return_changes': true}).run(connection!); + + expect(replace['changes'][0]['new_val'].containsKey('birthYear'), + equals(false)); + expect( + replace['changes'][0]['new_val'].containsKey('name'), equals(true)); + expect(replace['changes'][0]['old_val'].containsKey('birthYear'), + equals(true)); + expect( + replace['changes'][0]['old_val'].containsKey('name'), equals(true)); + expect(replace['replaced'], equals(1)); + + var custom = await r.table(tableName!).insert({ + 'id': 1, + 'name': 'Jane Doe' + }, { + 'conflict': (id, oldVal, newVal) => + {'id': oldVal('id'), 'err': 'bad record'}, + 
'return_changes': true + }).run(connection!); + expect(custom['changes'][0]['new_val'].containsKey('err'), equals(true)); + expect(replace['replaced'], equals(1)); + }); + }); + + group("update command -> ", () { + test("should update all records in a table", () async { + Map update = await r + .table(tableName!) + .update({'lastModified': DateTime.now()}).run(connection!); + + expect(update['replaced'], equals(5)); + }); + + test("should update all records in a selection", () async { + Map updatedSelection = await r + .table(tableName!) + .getAll(1, 2) + .update({'newField': 33}).run(connection!); + + expect(updatedSelection['replaced'], equals(2)); + }); + + test("should update a single record", () async { + Map updatedSelection = await r + .table(tableName!) + .get(1) + .update({'newField2': 44}).run(connection!); + + expect(updatedSelection['replaced'], equals(1)); + }); + + test("should update with durability option", () async { + Map updatedSelection = await r + .table(tableName!) + .get(1) + .update({'newField2': 22}, {'durability': 'soft'}).run(connection!); + + expect(updatedSelection['replaced'], equals(1)); + }); + + test("should update with return_changes option", () async { + Map updatedSelection = await r.table(tableName!).get(1).update( + {'newField2': 11}, {'return_changes': 'always'}).run(connection!); + + expect(updatedSelection.containsKey('changes'), equals(true)); + }); + + test("should update with non_atomic option", () async { + Map updatedSelection = await r + .table(tableName!) + .get(1) + .update({'newField2': 00}, {'non_atomic': true}).run(connection!); + + expect(updatedSelection['replaced'], equals(1)); + }); + + test("should update with r.literal", () async { + Map updated = await r.table(tableName!).get(2).update({ + 'kit': r.literal({'bells': 'cow'}) + }, { + 'return_changes': true + }).run(connection!); + + Map oldVal = updated['changes'][0]['old_val']; + Map newVal = updated['changes'][0]['new_val']; + + expect(oldVal['kit'].containsKey('drums'), equals(true)); + expect(newVal['kit'].containsKey('bells'), equals(true)); + expect(newVal['kit'].containsKey('drums'), equals(false)); + }); + }); + + group("replace command -> ", () { + test("should replace a single selection", () async { + Map replaced = + await r.table(tableName!).get(1).replace({'id': 1}).run(connection!); + expect(replaced['replaced'], equals(1)); + }); + + test("should replace a selection", () async { + Map replaced = await r.table(tableName!).getAll(1, 2).replace((item) { + return item.pluck('kit', 'id'); + }, {'return_changes': true}).run(connection!); + + expect(replaced['replaced'], equals(1)); + + Map newVal = replaced['changes'][0]['new_val']; + Map oldVal = replaced['changes'][0]['old_val']; + + expect(newVal.containsKey('lastModified'), equals(false)); + expect(oldVal.containsKey('lastModified'), equals(true)); + }); + + test("should populate errors", () async { + Map replaceError = + await r.table(tableName!).get(1).replace({}).run(connection!); + + expect(replaceError['errors'], equals(1)); + expect(replaceError['first_error'], + equals('Inserted object must have primary key `id`:\n{}')); + }); + }); + + group("delete command -> ", () { + test("should delete a single selection", () async { + Map? deleted = await r + .table(tableName!) + .get(1) + .delete({'return_changes': true}).run(connection!); + + Map? newVal = deleted!['changes'][0]['new_val']; + Map? 
oldVal = deleted['changes'][0]['old_val']; + + expect(deleted['deleted'], equals(1)); + expect(newVal, equals(null)); + expect(oldVal!['id'], equals(1)); + }); + + test("should delete a selection", () async { + Map? deleted = await r + .table(tableName!) + .limit(2) + .delete({'return_changes': true}).run(connection!); + + expect(deleted!['changes'].length, equals(2)); + + Map? newVal = deleted['changes'][0]['new_val']; + Map? oldVal = deleted['changes'][0]['old_val']; + + expect(deleted['deleted'], equals(2)); + expect(newVal, equals(null)); + + expect(oldVal!.containsKey('name') || oldVal.containsKey('kit'), + equals(true)); + + newVal = deleted['changes'][1]['new_val']; + oldVal = deleted['changes'][1]['old_val']; + + expect(newVal, equals(null)); + expect(oldVal!.containsKey('name') || oldVal.containsKey('kit'), + equals(true)); + }); + + test("should delete all records on a table", () async { + await r.table(tableName!).delete().run(connection!); + Cursor results = await r.table(tableName!).run(connection!); + + List resList = await results.toList(); + expect(resList.isEmpty, equals(true)); + }); + }); + + group("sync command -> ", () { + test("should sync", () async { + Map syncComplete = await r.table(tableName!).sync().run(connection!); + + expect(syncComplete.containsKey('synced'), equals(true)); + expect(syncComplete['synced'], equals(1)); + }); + }); + + test("remove the test database", () async { + Map response = await r.dbDrop(testDbName!).run(connection!); + + expect(response.containsKey('config_changes'), equals(true)); + expect(response['dbs_dropped'], equals(1)); + }); +} diff --git a/incubation/.gitkeep b/incubation/.gitkeep new file mode 100644 index 0000000..e69de29 diff --git a/melos.yaml b/melos.yaml index f4eeb3b..39bbe58 100644 --- a/melos.yaml +++ b/melos.yaml @@ -11,8 +11,11 @@ name: protevus_platform repository: https://github.com/protevus/platform packages: - - apps/** + - common/** + - drivers/** - packages/** + - incubation/** + - apps/** - helpers/tools/** - examples/** diff --git a/packages/conditionable/.gitignore b/packages/conditionable/.gitignore new file mode 100644 index 0000000..3cceda5 --- /dev/null +++ b/packages/conditionable/.gitignore @@ -0,0 +1,7 @@ +# https://dart.dev/guides/libraries/private-files +# Created by `dart pub` +.dart_tool/ + +# Avoid committing pubspec.lock for library packages; see +# https://dart.dev/guides/libraries/private-files#pubspeclock. +pubspec.lock diff --git a/packages/conditionable/CHANGELOG.md b/packages/conditionable/CHANGELOG.md new file mode 100644 index 0000000..effe43c --- /dev/null +++ b/packages/conditionable/CHANGELOG.md @@ -0,0 +1,3 @@ +## 1.0.0 + +- Initial version. diff --git a/packages/conditionable/LICENSE.md b/packages/conditionable/LICENSE.md new file mode 100644 index 0000000..0fd0d03 --- /dev/null +++ b/packages/conditionable/LICENSE.md @@ -0,0 +1,10 @@ +The MIT License (MIT) + +The Laravel Framework is Copyright (c) Taylor Otwell +The Fabric Framework is Copyright (c) Vieo, Inc. 
+ +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. \ No newline at end of file diff --git a/packages/conditionable/README.md b/packages/conditionable/README.md new file mode 100644 index 0000000..1ea6f75 --- /dev/null +++ b/packages/conditionable/README.md @@ -0,0 +1,165 @@ +# Platform Conditionable + +A Dart implementation of Laravel's Conditionable trait, providing fluent conditional execution with method chaining. + +## Features + +- Conditional method execution with `when` and `unless` +- Support for method chaining +- Cascade notation support with `whenThen` and `unlessThen` +- Fallback execution with `orElse` handlers +- Support for both direct values and closure conditions + +## Usage + +```dart +import 'package:platform_conditionable/platform_conditionable.dart'; + +// Add the Conditionable mixin to your class +class YourClass with Conditionable { + // Your class implementation +} +``` + +### Basic Conditional Execution + +```dart +class QueryBuilder with Conditionable { + final conditions = []; + + void addCondition(String condition) { + conditions.add(condition); + } +} + +final query = QueryBuilder(); + +// Using when +query.when(hasStatus, (self, _) { + (self as QueryBuilder).addCondition("status = 'active'"); +}); + +// Using unless +query.unless(category == null, (self, _) { + (self as QueryBuilder).addCondition("category = '$category'"); +}); +``` + +### Method Chaining with Cascade Notation + +```dart +class Config with Conditionable { + bool debugMode = false; + List features = []; +} + +final config = Config(); +config + ..whenThen( + isDevelopment, + () => config.features.add('debug-toolbar'), + ) + ..unlessThen( + isProduction, + () => config.features.add('detailed-logs')); +``` + +### Using Fallback Handlers + +```dart +final result = instance.when( + condition, + (self, value) => 'Primary result', + orElse: (self, value) => 'Fallback result', +); +``` + +### Closure Conditions + +```dart +instance.when( + () => someComplexCondition(), + (self, value) { + // Execute when condition is true + }, +); +``` + +## Features in Detail + +### The `when` Method + +Executes a callback if the condition is true: + +```dart +instance.when(condition, (self, value) { + // Executed if condition is true +}); +``` + +### The `unless` Method + +Executes a callback if the condition is false: + +```dart +instance.unless(condition, (self, value) { + // Executed if condition is false +}); +``` + +### Cascade Notation with `whenThen` and `unlessThen` + +For void operations that work well with cascade notation: + +```dart +instance +
..whenThen(condition1, () { + // Execute if condition1 is true + }) + ..unlessThen(condition2, () { + // Execute if condition2 is false + }); +``` + +### Fallback Handling + +All methods support fallback execution through `orElse`: + +```dart +instance.when( + condition, + (self, value) => 'Primary action', + orElse: (self, value) => 'Fallback action', +); + +instance.whenThen( + condition, + () => print('Primary action'), + orElse: () => print('Fallback action'), +); +``` + +## Example + +See the [example](example/platform_conditionable_example.dart) for a complete demonstration of all features, including: + +- Conditional query building +- Configuration setup +- Method chaining +- Fallback handlers +- Closure conditions + +## Important Notes + +1. When using callbacks that need to access the instance methods, cast the `self` parameter to your class type: + ```dart + instance.when(condition, (self, value) { + (self as YourClass).someMethod(); + }); + ``` + +2. The `whenThen` and `unlessThen` methods are designed for void operations and work well with cascade notation. + +3. Conditions can be either direct values or closures that return a value. + +4. All methods return the instance by default if no callback is provided, enabling method chaining. diff --git a/packages/conditionable/analysis_options.yaml b/packages/conditionable/analysis_options.yaml new file mode 100644 index 0000000..dee8927 --- /dev/null +++ b/packages/conditionable/analysis_options.yaml @@ -0,0 +1,30 @@ +# This file configures the static analysis results for your project (errors, +# warnings, and lints). +# +# This enables the 'recommended' set of lints from `package:lints`. +# This set helps identify many issues that may lead to problems when running +# or consuming Dart code, and enforces writing Dart using a single, idiomatic +# style and format. +# +# If you want a smaller set of lints you can change this to specify +# 'package:lints/core.yaml'. These are just the most critical lints +# (the recommended set includes the core lints). +# The core lints are also what is used by pub.dev for scoring packages. + +include: package:lints/recommended.yaml + +# Uncomment the following section to specify additional rules. 
+ +# linter: +# rules: +# - camel_case_types + +# analyzer: +# exclude: +# - path/to/excluded/files/** + +# For more information about the core and recommended set of lints, see +# https://dart.dev/go/core-lints + +# For additional information about configuring this file, see +# https://dart.dev/guides/language/analysis-options diff --git a/packages/conditionable/doc/.gitkeep b/packages/conditionable/doc/.gitkeep new file mode 100644 index 0000000..e69de29 diff --git a/packages/conditionable/example/platform_conditionable_example.dart b/packages/conditionable/example/platform_conditionable_example.dart new file mode 100644 index 0000000..97e3397 --- /dev/null +++ b/packages/conditionable/example/platform_conditionable_example.dart @@ -0,0 +1,103 @@ +import 'package:platform_conditionable/platform_conditionable.dart'; + +// A simple builder class that demonstrates Conditionable usage +class QueryBuilder with Conditionable { + final List _conditions = []; + + void addCondition(String condition) { + _conditions.add(condition); + } + + String build() => _conditions.join(' AND '); +} + +// A configuration class showing conditional setup +class Config with Conditionable { + String environment = 'development'; + bool debugMode = false; + List features = []; + + @override + String toString() { + return 'Config(environment: $environment, ' + 'debugMode: $debugMode, features: $features)'; + } +} + +void main() { + // Example 1: Conditional Query Building + print('Example 1: Conditional Query Building'); + final query = QueryBuilder(); + + final hasStatus = true; + final status = 'active'; + final minAge = 21; + final category = null; + + query + ..when(hasStatus, (self, _) { + (self as QueryBuilder).addCondition("status = '$status'"); + }) + ..when(minAge >= 18, (self, _) { + (self as QueryBuilder).addCondition('age >= $minAge'); + }) + ..unless(category == null, (self, value) { + (self as QueryBuilder).addCondition("category = '$value'"); + }); + + print('Generated query: ${query.build()}'); + print('---\n'); + + // Example 2: Configuration Setup + print('Example 2: Configuration Setup'); + final config = Config(); + + // Using when with direct conditions + config + ..when(true, (self, _) { + (self as Config).debugMode = true; + }) + ..whenThen( + config.environment == 'development', + () => config.features.add('debug-toolbar'), + ); + + // Using unless with closures + config.unless(() => config.environment == 'production', (self, _) { + (self as Config).features.add('detailed-logs'); + }); + + print('Final config: $config'); + print('---\n'); + + // Example 3: Conditional Value Resolution + print('Example 3: Conditional Value Resolution'); + final mode = config.when( + config.debugMode, + (self, _) => 'Debug Mode Active', + orElse: (self, _) => 'Production Mode', + ); + print('Current mode: $mode'); + + final featureStatus = config.unless( + config.features.isEmpty, + (self, _) => 'Active features: ${config.features}', + orElse: (self, _) => 'No features enabled', + ); + print('Features: $featureStatus'); + print('---\n'); + + // Example 4: Using orElse handlers with cascade notation + print('Example 4: Using orElse Handlers'); + config + ..whenThen( + false, + () => print('This will not execute'), + orElse: () => print('Using fallback configuration'), + ) + ..unlessThen( + config.environment == 'production', + () => print('Running in development mode'), + orElse: () => print('Running in production mode'), + ); +} diff --git a/packages/conditionable/lib/platform_conditionable.dart 
b/packages/conditionable/lib/platform_conditionable.dart new file mode 100644 index 0000000..22c9051 --- /dev/null +++ b/packages/conditionable/lib/platform_conditionable.dart @@ -0,0 +1,4 @@ +/// A library that provides conditional execution with a fluent interface. +library platform_conditionable; + +export 'src/conditionable.dart'; diff --git a/packages/conditionable/lib/src/conditionable.dart b/packages/conditionable/lib/src/conditionable.dart new file mode 100644 index 0000000..915689d --- /dev/null +++ b/packages/conditionable/lib/src/conditionable.dart @@ -0,0 +1,119 @@ +import 'package:meta/meta.dart'; + +/// A mixin that provides conditional execution with a fluent interface. +/// +/// This mixin allows for conditional method chaining similar to Laravel's +/// Conditionable trait. +mixin Conditionable { + /// Executes the callback if the given value is truthy. + /// + /// The value can be either a direct value or a closure that returns a value. + /// If a callback is provided and the condition is true, it will be executed + /// with the current instance and value as parameters. + /// + /// ```dart + /// instance.when(condition, (self, value) { + /// // Execute when condition is true + /// }); + /// ``` + dynamic when( + dynamic value, + dynamic Function(dynamic self, dynamic value)? callback, { + dynamic Function(dynamic self, dynamic value)? orElse, + }) { + // Evaluate the condition + final condition = value is Function ? value() : value; + + if (condition == true) { + // Execute callback if condition is true + return callback?.call(this, condition) ?? this; + } else if (orElse != null) { + // Execute orElse callback if provided + return orElse(this, condition); + } + + return this; + } + + /// Executes the callback if the given value is falsy. + /// + /// The value can be either a direct value or a closure that returns a value. + /// If a callback is provided and the condition is false, it will be executed + /// with the current instance and value as parameters. + /// + /// ```dart + /// instance.unless(condition, (self, value) { + /// // Execute when condition is false + /// }); + /// ``` + dynamic unless( + dynamic value, + dynamic Function(dynamic self, dynamic value)? callback, { + dynamic Function(dynamic self, dynamic value)? orElse, + }) { + // Evaluate the condition + final condition = value is Function ? value() : value; + + if (condition != true) { + // Execute callback if condition is false + return callback?.call(this, condition) ?? this; + } else if (orElse != null) { + // Execute orElse callback if provided + return orElse(this, condition); + } + + return this; + } + + /// Creates a conditional chain that can be used with method cascades. + /// + /// ```dart + /// instance + /// ..whenThen(condition, () { + /// // Execute when condition is true + /// }) + /// ..unlessThen(otherCondition, () { + /// // Execute when otherCondition is false + /// }); + /// ``` + @useResult + void whenThen( + dynamic value, + void Function() callback, { + void Function()? orElse, + }) { + final condition = value is Function ? value() : value; + + if (condition == true) { + callback(); + } else if (orElse != null) { + orElse(); + } + } + + /// Creates a negative conditional chain that can be used with method cascades. 
+ /// + /// ```dart + /// instance + /// ..unlessThen(condition, () { + /// // Execute when condition is false + /// }) + /// ..whenThen(otherCondition, () { + /// // Execute when otherCondition is true + /// }); + /// ``` + @useResult + void unlessThen( + dynamic value, + void Function() callback, { + void Function()? orElse, + }) { + final condition = value is Function ? value() : value; + + if (condition != true) { + callback(); + } else if (orElse != null) { + orElse(); + } + } +} diff --git a/packages/conditionable/pubspec.yaml b/packages/conditionable/pubspec.yaml new file mode 100644 index 0000000..3250176 --- /dev/null +++ b/packages/conditionable/pubspec.yaml @@ -0,0 +1,13 @@ +name: platform_conditionable +description: A Dart implementation of Laravel's Conditionable trait, providing fluent conditional execution +version: 0.1.0 + +environment: + sdk: ">=3.0.0 <4.0.0" + +dependencies: + meta: ^1.9.0 + +dev_dependencies: + lints: ^2.1.0 + test: ^1.24.0 diff --git a/packages/conditionable/test/conditionable_test.dart b/packages/conditionable/test/conditionable_test.dart new file mode 100644 index 0000000..edabb9b --- /dev/null +++ b/packages/conditionable/test/conditionable_test.dart @@ -0,0 +1,189 @@ +import 'package:test/test.dart'; +import '../lib/platform_conditionable.dart'; + +class TestClass with Conditionable { + String value = ''; + + TestClass append(String text) { + value += text; + return this; + } +} + +void main() { + group('Conditionable', () { + late TestClass instance; + + setUp(() { + instance = TestClass(); + }); + + group('when', () { + test('executes callback when condition is true', () { + instance.when(true, (self, value) { + (self as TestClass).append('true'); + }); + + expect(instance.value, equals('true')); + }); + + test('skips callback when condition is false', () { + instance.when(false, (self, value) { + (self as TestClass).append('false'); + }); + + expect(instance.value, isEmpty); + }); + + test('executes orElse when condition is false', () { + instance.when( + false, + (self, value) { + (self as TestClass).append('false'); + }, + orElse: (self, value) { + (self as TestClass).append('else'); + }, + ); + + expect(instance.value, equals('else')); + }); + + test('evaluates closure conditions', () { + instance.when(() => true, (self, value) { + (self as TestClass).append('closure'); + }); + + expect(instance.value, equals('closure')); + }); + + test('supports method chaining', () { + instance.when(true, (self, value) { + (self as TestClass).append('first'); + return self; + }).when(true, (self, value) { + (self as TestClass).append('-second'); + return self; + }); + + expect(instance.value, equals('first-second')); + }); + }); + + group('unless', () { + test('executes callback when condition is false', () { + instance.unless(false, (self, value) { + (self as TestClass).append('false'); + }); + + expect(instance.value, equals('false')); + }); + + test('skips callback when condition is true', () { + instance.unless(true, (self, value) { + (self as TestClass).append('true'); + }); + + expect(instance.value, isEmpty); + }); + + test('executes orElse when condition is true', () { + instance.unless( + true, + (self, value) { + (self as TestClass).append('true'); + }, + orElse: (self, value) { + (self as TestClass).append('else'); + }, + ); + + expect(instance.value, equals('else')); + }); + + test('evaluates closure conditions', () { + instance.unless(() => false, (self, value) { + (self as TestClass).append('closure'); + }); + + expect(instance.value, 
equals('closure')); + }); + }); + + group('whenThen', () { + test('executes callback in method cascade when condition is true', () { + instance + ..whenThen(true, () { + instance.append('cascade'); + }) + ..append('-end'); + + expect(instance.value, equals('cascade-end')); + }); + + test('executes orElse in method cascade when condition is false', () { + instance + ..whenThen( + false, + () { + instance.append('false'); + }, + orElse: () { + instance.append('else'); + }, + ) + ..append('-end'); + + expect(instance.value, equals('else-end')); + }); + }); + + group('unlessThen', () { + test('executes callback in method cascade when condition is false', () { + instance + ..unlessThen(false, () { + instance.append('cascade'); + }) + ..append('-end'); + + expect(instance.value, equals('cascade-end')); + }); + + test('executes orElse in method cascade when condition is true', () { + instance + ..unlessThen( + true, + () { + instance.append('true'); + }, + orElse: () { + instance.append('else'); + }, + ) + ..append('-end'); + + expect(instance.value, equals('else-end')); + }); + }); + + test('complex chaining with mixed conditions', () { + instance + ..when(true, (self, value) { + (self as TestClass).append('1'); + return self; + }) + ..unless(false, (self, value) { + (self as TestClass).append('-2'); + return self; + }) + ..whenThen(true, () { + instance.append('-3'); + }) + ..unlessThen(false, () { + instance.append('-4'); + }); + + expect(instance.value, equals('1-2-3-4')); + }); + }); +} diff --git a/packages/foundation/CHANGELOG.md b/packages/foundation/CHANGELOG.md index 28c3e12..27044f3 100644 --- a/packages/foundation/CHANGELOG.md +++ b/packages/foundation/CHANGELOG.md @@ -88,8 +88,8 @@ ## 4.2.0 -* Updated to `package:belatuk_combinator` -* Updated to `package:belatuk_merge_map` +* Updated to `package:platform_combinator` +* Updated to `package:platform_merge_map` * Updated linter to `package:lints` ## 4.1.3 @@ -108,7 +108,7 @@ ## 4.1.0 -* Replaced `http_server` with `belatuk_http_server` +* Replaced `http_server` with `platform_http_server` ## 4.0.4 diff --git a/packages/foundation/lib/core.dart b/packages/foundation/lib/core.dart index 30fc78c..c850df0 100644 --- a/packages/foundation/lib/core.dart +++ b/packages/foundation/lib/core.dart @@ -3,5 +3,5 @@ library platform_foundation; export 'package:platform_support/exceptions.dart'; export 'package:platform_model/model.dart'; -export 'package:platform_route/route.dart'; +export 'package:platform_routing/route.dart'; export 'src/core/core.dart'; diff --git a/packages/foundation/lib/http.dart b/packages/foundation/lib/http.dart index 5f00eff..632a9cb 100644 --- a/packages/foundation/lib/http.dart +++ b/packages/foundation/lib/http.dart @@ -1 +1 @@ -export 'src/http/http.dart'; +export 'src/http/v1/http.dart'; diff --git a/packages/foundation/lib/http2.dart b/packages/foundation/lib/http2.dart index 7fd2aaf..01ef0c2 100644 --- a/packages/foundation/lib/http2.dart +++ b/packages/foundation/lib/http2.dart @@ -1,3 +1,3 @@ -export 'src/http2/protevus_http2.dart'; -export 'src/http2/http2_request_context.dart'; -export 'src/http2/http2_response_context.dart'; +export 'src/http/v2/protevus_http2.dart'; +export 'src/http/v2/http2_request_context.dart'; +export 'src/http/v2/http2_response_context.dart'; diff --git a/packages/foundation/lib/src/core/controller.dart b/packages/foundation/lib/src/core/controller.dart index 842a5cd..5590400 100644 --- a/packages/foundation/lib/src/core/controller.dart +++ 
b/packages/foundation/lib/src/core/controller.dart @@ -2,7 +2,7 @@ library platform_foundation.http.controller; import 'dart:async'; import 'package:platform_container/container.dart'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:meta/meta.dart'; import 'package:recase/recase.dart'; import '../core/core.dart'; diff --git a/packages/foundation/lib/src/core/driver.dart b/packages/foundation/lib/src/core/driver.dart index 4c262de..d5415e1 100644 --- a/packages/foundation/lib/src/core/driver.dart +++ b/packages/foundation/lib/src/core/driver.dart @@ -2,8 +2,8 @@ import 'dart:async'; import 'dart:convert'; import 'dart:io' show Cookie; import 'package:platform_support/exceptions.dart'; -import 'package:platform_route/route.dart'; -import 'package:belatuk_combinator/belatuk_combinator.dart'; +import 'package:platform_routing/route.dart'; +import 'package:platform_combinator/combinator.dart'; import 'package:stack_trace/stack_trace.dart'; import 'package:tuple/tuple.dart'; import 'core.dart'; diff --git a/packages/foundation/lib/src/core/hostname_router.dart b/packages/foundation/lib/src/core/hostname_router.dart index 9b7e121..d8667f4 100644 --- a/packages/foundation/lib/src/core/hostname_router.dart +++ b/packages/foundation/lib/src/core/hostname_router.dart @@ -1,6 +1,6 @@ import 'dart:async'; import 'package:platform_container/container.dart'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:logging/logging.dart'; import 'env.dart'; import 'hostname_parser.dart'; diff --git a/packages/foundation/lib/src/core/request_context.dart b/packages/foundation/lib/src/core/request_context.dart index 5bc3603..8b3e873 100644 --- a/packages/foundation/lib/src/core/request_context.dart +++ b/packages/foundation/lib/src/core/request_context.dart @@ -8,7 +8,7 @@ import 'dart:io' import 'package:platform_container/container.dart'; import 'package:http_parser/http_parser.dart'; -import 'package:belatuk_http_server/belatuk_http_server.dart'; +import 'package:platform_http_server/http_server.dart'; import 'package:meta/meta.dart'; import 'package:mime/mime.dart'; import 'package:path/path.dart' as p; diff --git a/packages/foundation/lib/src/core/response_context.dart b/packages/foundation/lib/src/core/response_context.dart index 88f0704..da29f49 100644 --- a/packages/foundation/lib/src/core/response_context.dart +++ b/packages/foundation/lib/src/core/response_context.dart @@ -6,7 +6,7 @@ import 'dart:convert' as c show json; import 'dart:io' show BytesBuilder, Cookie; import 'dart:typed_data'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:file/file.dart'; import 'package:http_parser/http_parser.dart'; import 'package:mime/mime.dart'; diff --git a/packages/foundation/lib/src/core/routable.dart b/packages/foundation/lib/src/core/routable.dart index e4a8598..5824a24 100644 --- a/packages/foundation/lib/src/core/routable.dart +++ b/packages/foundation/lib/src/core/routable.dart @@ -3,7 +3,7 @@ library angel_framework.http.routable; import 'dart:async'; import 'package:platform_container/container.dart'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import '../util.dart'; import 'hooked_service.dart'; diff --git a/packages/foundation/lib/src/core/server.dart b/packages/foundation/lib/src/core/server.dart index e6d2876..792cf76 100644 --- a/packages/foundation/lib/src/core/server.dart +++ 
b/packages/foundation/lib/src/core/server.dart @@ -5,8 +5,8 @@ import 'dart:collection' show HashMap; import 'dart:convert'; import 'package:platform_container/container.dart'; import 'package:platform_support/exceptions.dart'; -import 'package:platform_route/route.dart'; -import 'package:belatuk_combinator/belatuk_combinator.dart'; +import 'package:platform_routing/route.dart'; +import 'package:platform_combinator/combinator.dart'; import 'package:http_parser/http_parser.dart'; import 'package:logging/logging.dart'; import 'package:mime/mime.dart'; diff --git a/packages/foundation/lib/src/core/service.dart b/packages/foundation/lib/src/core/service.dart index 051d91a..542712f 100644 --- a/packages/foundation/lib/src/core/service.dart +++ b/packages/foundation/lib/src/core/service.dart @@ -2,7 +2,7 @@ library platform_foundation.http.service; import 'dart:async'; import 'package:platform_support/exceptions.dart'; -import 'package:belatuk_merge_map/belatuk_merge_map.dart'; +import 'package:platform_merge_map/merge_map.dart'; import 'package:quiver/core.dart'; import '../util.dart'; import 'anonymous_service.dart'; diff --git a/packages/foundation/lib/src/http/http.dart b/packages/foundation/lib/src/http/v1/http.dart similarity index 100% rename from packages/foundation/lib/src/http/http.dart rename to packages/foundation/lib/src/http/v1/http.dart diff --git a/packages/foundation/lib/src/http/http_request_context.dart b/packages/foundation/lib/src/http/v1/http_request_context.dart similarity index 98% rename from packages/foundation/lib/src/http/http_request_context.dart rename to packages/foundation/lib/src/http/v1/http_request_context.dart index bbc0cb5..b7a661b 100644 --- a/packages/foundation/lib/src/http/http_request_context.dart +++ b/packages/foundation/lib/src/http/v1/http_request_context.dart @@ -4,7 +4,7 @@ import 'dart:io'; import 'package:platform_container/container.dart'; import 'package:http_parser/http_parser.dart'; -import '../core/core.dart'; +import '../../core/core.dart'; /// An implementation of [RequestContext] that wraps a [HttpRequest]. class HttpRequestContext extends RequestContext { diff --git a/packages/foundation/lib/src/http/http_response_context.dart b/packages/foundation/lib/src/http/v1/http_response_context.dart similarity index 99% rename from packages/foundation/lib/src/http/http_response_context.dart rename to packages/foundation/lib/src/http/v1/http_response_context.dart index ed3d06b..31044c8 100644 --- a/packages/foundation/lib/src/http/http_response_context.dart +++ b/packages/foundation/lib/src/http/v1/http_response_context.dart @@ -4,7 +4,7 @@ import 'dart:io' hide BytesBuilder; import 'dart:typed_data' show BytesBuilder; import 'package:http_parser/http_parser.dart'; -import '../core/core.dart'; +import '../../core/core.dart'; import 'http_request_context.dart'; /// An implementation of [ResponseContext] that abstracts over an [HttpResponse]. 
diff --git a/packages/foundation/lib/src/http/protevus_http.dart b/packages/foundation/lib/src/http/v1/protevus_http.dart similarity index 100% rename from packages/foundation/lib/src/http/protevus_http.dart rename to packages/foundation/lib/src/http/v1/protevus_http.dart diff --git a/packages/foundation/lib/src/http2/http2_request_context.dart b/packages/foundation/lib/src/http/v2/http2_request_context.dart similarity index 100% rename from packages/foundation/lib/src/http2/http2_request_context.dart rename to packages/foundation/lib/src/http/v2/http2_request_context.dart diff --git a/packages/foundation/lib/src/http2/http2_response_context.dart b/packages/foundation/lib/src/http/v2/http2_response_context.dart similarity index 100% rename from packages/foundation/lib/src/http2/http2_response_context.dart rename to packages/foundation/lib/src/http/v2/http2_response_context.dart diff --git a/packages/foundation/lib/src/http2/protevus_http2.dart b/packages/foundation/lib/src/http/v2/protevus_http2.dart similarity index 100% rename from packages/foundation/lib/src/http2/protevus_http2.dart rename to packages/foundation/lib/src/http/v2/protevus_http2.dart diff --git a/packages/foundation/pubspec.yaml b/packages/foundation/pubspec.yaml index 8d54cc3..3f5a4c9 100644 --- a/packages/foundation/pubspec.yaml +++ b/packages/foundation/pubspec.yaml @@ -9,12 +9,12 @@ environment: dependencies: platform_container: ^9.0.0 platform_model: ^9.0.0 - platform_route: ^9.0.0 + platform_routing: ^9.0.0 platform_support: ^9.0.0 platform_testing: ^9.0.0 - belatuk_merge_map: ^5.1.0 - belatuk_combinator: ^5.2.0 - belatuk_http_server: ^4.4.0 + platform_merge_map: ^5.1.0 + platform_combinator: ^5.2.0 + platform_http_server: ^4.4.0 charcode: ^1.3.1 file: ^7.0.1 http_parser: ^4.1.1 @@ -22,7 +22,7 @@ dependencies: logging: ^1.3.0 matcher: ^0.12.16 meta: ^1.16.0 - mime: ^1.0.0 + mime: ^2.0.0 path: ^1.9.1 quiver: ^3.2.2 recase: ^4.1.0 @@ -45,7 +45,7 @@ dev_dependencies: # path: ../http_exception # platform_model: # path: ../model -# platform_route: +# platform_routing: # path: ../route # platform_mock_request: # path: ../mock_request diff --git a/packages/foundation/test/response_header_test.dart b/packages/foundation/test/response_header_test.dart index 5b7bba0..91dff48 100644 --- a/packages/foundation/test/response_header_test.dart +++ b/packages/foundation/test/response_header_test.dart @@ -2,7 +2,7 @@ import 'dart:io'; import 'package:platform_container/mirrors.dart'; import 'package:platform_foundation/core.dart'; -import 'package:platform_foundation/src/http/protevus_http.dart'; +import 'package:platform_foundation/src/http/v1/protevus_http.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/.gitignore b/packages/routing/.gitignore similarity index 100% rename from packages/route/.gitignore rename to packages/routing/.gitignore diff --git a/packages/routing/AUTHORS.md b/packages/routing/AUTHORS.md new file mode 100644 index 0000000..ac95ab5 --- /dev/null +++ b/packages/routing/AUTHORS.md @@ -0,0 +1,12 @@ +Primary Authors +=============== + +* __[Thomas Hii](dukefirehawk.apps@gmail.com)__ + + Thomas is the current maintainer of the code base. He has refactored and migrated the + code base to support NNBD. + +* __[Tobe O](thosakwe@gmail.com)__ + + Tobe has written much of the original code prior to NNBD migration. He has moved on and + is no longer involved with the project. 
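The hunks above and below carry the `platform_route` → `platform_routing` rename through the framework packages and the routing package itself. For downstream code the change is confined to the pubspec dependency and the import path; the exported routing API is untouched. A minimal consumer-side sketch of the migration (the `main.dart` shown here is hypothetical, not part of this changeset):

```dart
// Before this changeset:
// import 'package:platform_route/route.dart';

// After this changeset the package, library, and import path are renamed:
import 'package:platform_routing/route.dart';

void main() {
  // The Router class and the rest of the exported symbols are unchanged;
  // only the package name and import prefix move.
  final router = Router();
  print(router);
}
```

The same one-line substitution is what the test and example hunks in the remaining files of this diff perform.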
diff --git a/packages/route/CHANGELOG.md b/packages/routing/CHANGELOG.md similarity index 97% rename from packages/route/CHANGELOG.md rename to packages/routing/CHANGELOG.md index 34b37d6..c8db760 100644 --- a/packages/route/CHANGELOG.md +++ b/packages/routing/CHANGELOG.md @@ -30,7 +30,7 @@ ## 5.1.0 -* Updated to use `package:belatuk_combinator` +* Updated to use `package:platform_combinator` * Updated linter to `package:lints` ## 5.0.1 diff --git a/packages/routing/LICENSE b/packages/routing/LICENSE new file mode 100644 index 0000000..df5e063 --- /dev/null +++ b/packages/routing/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, dukefirehawk.com +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/packages/route/README.md b/packages/routing/README.md similarity index 100% rename from packages/route/README.md rename to packages/routing/README.md diff --git a/packages/routing/analysis_options.yaml b/packages/routing/analysis_options.yaml new file mode 100644 index 0000000..572dd23 --- /dev/null +++ b/packages/routing/analysis_options.yaml @@ -0,0 +1 @@ +include: package:lints/recommended.yaml diff --git a/packages/route/example/main.dart b/packages/routing/example/main.dart similarity index 96% rename from packages/route/example/main.dart rename to packages/routing/example/main.dart index dc351a1..f8f4efa 100644 --- a/packages/route/example/main.dart +++ b/packages/routing/example/main.dart @@ -1,6 +1,6 @@ import 'dart:math'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; void main() { final router = Router(); diff --git a/packages/route/lib/browser.dart b/packages/routing/lib/browser.dart similarity index 100% rename from packages/route/lib/browser.dart rename to packages/routing/lib/browser.dart diff --git a/packages/route/lib/route.dart b/packages/routing/lib/route.dart similarity index 79% rename from packages/route/lib/route.dart rename to packages/routing/lib/route.dart index 8cd7ae4..a1a8105 100644 --- a/packages/route/lib/route.dart +++ b/packages/routing/lib/route.dart @@ -1,4 +1,4 @@ -library platform_route; +library platform_routing; export 'src/middleware_pipeline.dart'; export 'src/router.dart'; diff --git a/packages/route/lib/src/grammar.dart b/packages/routing/lib/src/grammar.dart similarity index 100% rename from packages/route/lib/src/grammar.dart rename to packages/routing/lib/src/grammar.dart diff --git a/packages/route/lib/src/middleware_pipeline.dart b/packages/routing/lib/src/middleware_pipeline.dart similarity index 100% rename from packages/route/lib/src/middleware_pipeline.dart rename to packages/routing/lib/src/middleware_pipeline.dart diff --git a/packages/route/lib/src/route.dart b/packages/routing/lib/src/route.dart similarity index 100% rename from packages/route/lib/src/route.dart rename to packages/routing/lib/src/route.dart diff --git a/packages/route/lib/src/router.dart b/packages/routing/lib/src/router.dart similarity index 99% rename from packages/route/lib/src/router.dart rename to packages/routing/lib/src/router.dart index 8585b10..fd4ab9a 100644 --- a/packages/route/lib/src/router.dart +++ b/packages/routing/lib/src/router.dart @@ -1,7 +1,7 @@ -library platform_route.src.router; +library platform_routing.src.router; import 'dart:async'; -import 'package:belatuk_combinator/belatuk_combinator.dart'; +import 'package:platform_combinator/combinator.dart'; import 'package:string_scanner/string_scanner.dart'; import '../string_util.dart'; diff --git a/packages/route/lib/src/routing_exception.dart b/packages/routing/lib/src/routing_exception.dart similarity index 100% rename from packages/route/lib/src/routing_exception.dart rename to packages/routing/lib/src/routing_exception.dart diff --git a/packages/route/lib/src/routing_result.dart b/packages/routing/lib/src/routing_result.dart similarity index 100% rename from packages/route/lib/src/routing_result.dart rename to packages/routing/lib/src/routing_result.dart diff --git a/packages/route/lib/src/symlink_route.dart b/packages/routing/lib/src/symlink_route.dart similarity index 100% rename from packages/route/lib/src/symlink_route.dart rename to packages/routing/lib/src/symlink_route.dart diff --git a/packages/route/lib/string_util.dart 
b/packages/routing/lib/string_util.dart similarity index 100% rename from packages/route/lib/string_util.dart rename to packages/routing/lib/string_util.dart diff --git a/packages/route/pubspec.yaml b/packages/routing/pubspec.yaml similarity index 90% rename from packages/route/pubspec.yaml rename to packages/routing/pubspec.yaml index f066f8e..e203b0d 100644 --- a/packages/route/pubspec.yaml +++ b/packages/routing/pubspec.yaml @@ -1,4 +1,4 @@ -name: platform_route +name: platform_routing version: 9.0.0 description: A powerful, isomorphic routing library for Dart. It is mainly used in the Protevus Platform, but can be used in Flutter and on the Web. homepage: https://protevus.com @@ -7,7 +7,7 @@ repository: https://git.protevus.com/protevus/platform/src/branch/main/packages/ environment: sdk: '>=3.3.0 <4.0.0' dependencies: - belatuk_combinator: ^5.2.0 + platform_combinator: ^5.2.0 string_scanner: ^1.4.0 path: ^1.9.1 dev_dependencies: diff --git a/packages/route/repubspec.yaml b/packages/routing/repubspec.yaml similarity index 100% rename from packages/route/repubspec.yaml rename to packages/routing/repubspec.yaml diff --git a/packages/route/test/chain_nest_test.dart b/packages/routing/test/chain_nest_test.dart similarity index 91% rename from packages/route/test/chain_nest_test.dart rename to packages/routing/test/chain_nest_test.dart index c136cb4..d85e3d6 100644 --- a/packages/route/test/chain_nest_test.dart +++ b/packages/routing/test/chain_nest_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/navigate_test.dart b/packages/routing/test/navigate_test.dart similarity index 95% rename from packages/route/test/navigate_test.dart rename to packages/routing/test/navigate_test.dart index e9bfe45..77a3cdd 100644 --- a/packages/route/test/navigate_test.dart +++ b/packages/routing/test/navigate_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/params_test.dart b/packages/routing/test/params_test.dart similarity index 96% rename from packages/route/test/params_test.dart rename to packages/routing/test/params_test.dart index cc0f111..067edb9 100644 --- a/packages/route/test/params_test.dart +++ b/packages/routing/test/params_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/parse_test.dart b/packages/routing/test/parse_test.dart similarity index 90% rename from packages/route/test/parse_test.dart rename to packages/routing/test/parse_test.dart index c691697..6dd0de3 100644 --- a/packages/route/test/parse_test.dart +++ b/packages/routing/test/parse_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/root_test.dart b/packages/routing/test/root_test.dart similarity index 86% rename from packages/route/test/root_test.dart rename to packages/routing/test/root_test.dart index 651e0e1..c3e946f 100644 --- a/packages/route/test/root_test.dart +++ b/packages/routing/test/root_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void 
main() { diff --git a/packages/route/test/server_test.dart b/packages/routing/test/server_test.dart similarity index 99% rename from packages/route/test/server_test.dart rename to packages/routing/test/server_test.dart index 0aaa66f..2d90391 100644 --- a/packages/route/test/server_test.dart +++ b/packages/routing/test/server_test.dart @@ -1,6 +1,6 @@ import 'dart:convert'; import 'dart:io'; -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:http/http.dart' as http; import 'package:test/test.dart'; diff --git a/packages/route/test/strip_test.dart b/packages/routing/test/strip_test.dart similarity index 94% rename from packages/route/test/strip_test.dart rename to packages/routing/test/strip_test.dart index 42d636c..51d0590 100644 --- a/packages/route/test/strip_test.dart +++ b/packages/routing/test/strip_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/string_util.dart'; +import 'package:platform_routing/string_util.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/uri_decode_test.dart b/packages/routing/test/uri_decode_test.dart similarity index 90% rename from packages/route/test/uri_decode_test.dart rename to packages/routing/test/uri_decode_test.dart index c3d543d..ffeba23 100644 --- a/packages/route/test/uri_decode_test.dart +++ b/packages/routing/test/uri_decode_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/test/wildcard_test.dart b/packages/routing/test/wildcard_test.dart similarity index 96% rename from packages/route/test/wildcard_test.dart rename to packages/routing/test/wildcard_test.dart index 862e37b..5c17f77 100644 --- a/packages/route/test/wildcard_test.dart +++ b/packages/routing/test/wildcard_test.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/route.dart'; +import 'package:platform_routing/route.dart'; import 'package:test/test.dart'; void main() { diff --git a/packages/route/web/hash/basic.dart b/packages/routing/web/hash/basic.dart similarity index 62% rename from packages/route/web/hash/basic.dart rename to packages/routing/web/hash/basic.dart index 39e4eb4..b797ac9 100644 --- a/packages/route/web/hash/basic.dart +++ b/packages/routing/web/hash/basic.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/browser.dart'; +import 'package:platform_routing/browser.dart'; import '../shared/basic.dart'; void main() => basic(BrowserRouter(hash: true)); diff --git a/packages/route/web/hash/basic.html b/packages/routing/web/hash/basic.html similarity index 100% rename from packages/route/web/hash/basic.html rename to packages/routing/web/hash/basic.html diff --git a/packages/route/web/index.html b/packages/routing/web/index.html similarity index 100% rename from packages/route/web/index.html rename to packages/routing/web/index.html diff --git a/packages/route/web/push_state/basic.dart b/packages/routing/web/push_state/basic.dart similarity index 59% rename from packages/route/web/push_state/basic.dart rename to packages/routing/web/push_state/basic.dart index 06483ca..d1efa38 100644 --- a/packages/route/web/push_state/basic.dart +++ b/packages/routing/web/push_state/basic.dart @@ -1,4 +1,4 @@ -import 'package:platform_route/browser.dart'; +import 'package:platform_routing/browser.dart'; import '../shared/basic.dart'; void main() => basic(BrowserRouter()); diff --git a/packages/route/web/push_state/basic.html 
b/packages/routing/web/push_state/basic.html similarity index 100% rename from packages/route/web/push_state/basic.html rename to packages/routing/web/push_state/basic.html diff --git a/packages/route/web/shared/basic.dart b/packages/routing/web/shared/basic.dart similarity index 95% rename from packages/route/web/shared/basic.dart rename to packages/routing/web/shared/basic.dart index a64ee1d..985c02f 100644 --- a/packages/route/web/shared/basic.dart +++ b/packages/routing/web/shared/basic.dart @@ -1,5 +1,5 @@ import 'dart:html'; -import 'package:platform_route/browser.dart'; +import 'package:platform_routing/browser.dart'; void basic(BrowserRouter router) { final $h1 = window.document.querySelector('h1'); diff --git a/packages/support/LICENSE.md b/packages/support/LICENSE.md new file mode 100644 index 0000000..0fd0d03 --- /dev/null +++ b/packages/support/LICENSE.md @@ -0,0 +1,10 @@ +The MIT License (MIT) + +The Laravel Framework is Copyright (c) Taylor Otwell +The Fabric Framework is Copyright (c) Vieo, Inc. + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. \ No newline at end of file diff --git a/packages/support/lib/src/exceptions/http_exception.dart b/packages/support/lib/src/exceptions/http_exception.dart index 07e5a38..dbc0be9 100644 --- a/packages/support/lib/src/exceptions/http_exception.dart +++ b/packages/support/lib/src/exceptions/http_exception.dart @@ -1,6 +1,5 @@ library platform_http_exception; -//import 'package:dart2_constant/convert.dart'; import 'dart:convert'; /// Exception class that can be serialized to JSON and serialized to clients. 
diff --git a/packages/support/specification.yaml b/packages/support/specification.yaml new file mode 100644 index 0000000..fb6c5dc --- /dev/null +++ b/packages/support/specification.yaml @@ -0,0 +1,286 @@ +# Laravel Illuminate Support Package Specification + +name: support +description: Core support utilities and helper functions for the framework +version: 1.0.0 + +dependencies: + required: + collections: ^1.0.0 + contracts: ^1.0.0 + macroable: ^1.0.0 + +components: + # Core String Manipulation + Str: + description: String manipulation utilities + methods: + - after + - before + - between + - camel + - contains + - endsWith + - finish + - is + - isAscii + - kebab + - length + - limit + - lower + - orderedUuid + - padBoth + - padLeft + - padRight + - plural + - random + - replace + - replaceArray + - replaceFirst + - replaceLast + - singular + - slug + - snake + - start + - startsWith + - studly + - title + - ucfirst + - upper + - uuid + - words + + # Service Provider System + ServiceProvider: + description: Base service provider for package registration and bootstrapping + methods: + - register + - boot + - provides + - when + - defer + + # Facade System + Facade: + description: Base facade class for static proxy interface + methods: + - getFacadeAccessor + - getFacadeRoot + - clearResolvedInstance + - setFacadeApplication + - resolveFacadeInstance + + # Collection Handling + Collection: + description: Wrapper for array manipulation with fluent interface + methods: + - all + - average + - chunk + - collapse + - combine + - concat + - contains + - count + - diff + - each + - every + - except + - filter + - first + - flatMap + - flatten + - flip + - forget + - get + - groupBy + - has + - implode + - intersect + - isEmpty + - isNotEmpty + - keyBy + - keys + - last + - map + - mapInto + - mapSpread + - mapToGroups + - mapWithKeys + - max + - median + - merge + - min + - mode + - only + - pad + - partition + - pipe + - pluck + - random + - reduce + - reject + - reverse + - search + - shift + - shuffle + - slice + - sort + - sortBy + - sortByDesc + - splice + - split + - sum + - take + - tap + - times + - toArray + - toJson + - transform + - union + - unique + - values + - when + - where + - whereIn + - whereNotIn + - zip + + # Helper Traits + traits: + Macroable: + description: Allows dynamic method registration on classes + methods: + - macro + - mixin + - hasMacro + - flushMacros + + Conditionable: + description: Adds fluent conditional execution + methods: + - when + - unless + + Tappable: + description: Provides tap helper for debugging and chaining + methods: + - tap + + # Optional Features + optional: + - name: HtmlString + description: HTML string wrapper that prevents double encoding + + - name: MessageBag + description: Error message container + + - name: Optional + description: Nullable object wrapper + + - name: Pluralizer + description: Word pluralization utilities + +# Helper Functions +helpers: + array: + - array_add + - array_collapse + - array_divide + - array_dot + - array_except + - array_first + - array_flatten + - array_forget + - array_get + - array_has + - array_last + - array_only + - array_pluck + - array_prepend + - array_pull + - array_random + - array_set + - array_sort + - array_sort_recursive + - array_where + - array_wrap + + string: + - camel_case + - class_basename + - e + - ends_with + - kebab_case + - preg_replace_array + - snake_case + - starts_with + - str_after + - str_before + - str_contains + - str_finish + - str_is + - str_limit + - str_plural + - 
str_random + - str_singular + - str_slug + - str_start + - studly_case + - title_case + + misc: + - app + - auth + - back + - base_path + - bcrypt + - blank + - broadcast + - cache + - config + - cookie + - csrf_field + - csrf_token + - dd + - decrypt + - dispatch + - encrypt + - env + - event + - factory + - filled + - info + - logger + - method_field + - now + - old + - optional + - policy + - redirect + - report + - request + - rescue + - resolve + - response + - retry + - session + - tap + - throw_if + - throw_unless + - today + - trans + - trans_choice + - url + - validator + - view + - with + +# Testing Utilities +testing: + fakes: + - EventFake + - MailFake + - NotificationFake + - QueueFake + - BusFake
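The `specification.yaml` above enumerates the helper traits (`Macroable`, `Conditionable`, `Tappable`) this support package is expected to provide. `Conditionable` already lands as a mixin in this changeset; as a sketch of how another entry in that list might map onto Dart in the same style, here is a hypothetical `Tappable` mixin (illustrative only, not part of the diff):

```dart
/// Illustrative sketch of the spec's `Tappable` trait as a Dart mixin,
/// mirroring the Conditionable mixin added earlier in this changeset.
mixin Tappable {
  /// Passes this instance to [callback] (if given) and returns the instance,
  /// so side effects such as logging can be slotted into a fluent chain
  /// without breaking it.
  dynamic tap([void Function(dynamic instance)? callback]) {
    callback?.call(this);
    return this;
  }
}

class QueryBuilder with Tappable {
  final List<String> conditions = [];
}

void main() {
  final query = QueryBuilder()..conditions.add("status = 'active'");
  // Inspect the builder mid-chain; tap() returns the same instance.
  query.tap(
    (self) => print('conditions: ${(self as QueryBuilder).conditions}'),
  );
}
```

As with `Conditionable`, keeping the callback parameter `dynamic` trades static typing for a Laravel-like fluent surface; a generic `mixin Tappable<T>` would be the stricter alternative if the spec's traits are implemented for real.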