proj-oot-ootLibrariesNotes3


---

lua libs/frameworks/etc:

https://love2d.org/
https://github.com/kikito/middleclass (OOP)
https://github.com/mikelovesrobots/lua-enumerable - a port of Ruby's Enumerable module to Lua
https://github.com/makotok/Hanappe/tree/master/projects/hanappe-framework/src/hp/lang
http://moonscript.org/ - a dynamic scripting language that compiles into Lua
http://leafo.net/lapis/ - a web framework for Lua or MoonScript
https://github.com/leafo/moonlisp - a Lisp variant that compiles directly to Lua; written in MoonScript and depends on the MoonScript runtime

---

Rust has a small, platform-independent library 'libcore', and a larger 'libstd':

" The largest new feature in 1.6 is that libcore is now stable! Rust’s standard library is two-tiered: there’s a small core library, libcore, and the full standard library, libstd, that builds on top of it. libcore is completely platform agnostic, and requires only a handful of external symbols to be defined. Rust’s libstd builds on top of libcore, adding support for memory allocation, I/O, and concurrency. Applications using Rust in the embedded space, as well as those writing operating systems, often eschew libstd, using only libcore. "

---

check out Rust's crates versioning spec:

" [dependencies] glob = "0.0.3" num = "0.0.4"

The string value for each key in this table is a semver version requirement.

Caret requirements allow SemVer-compatible updates to a specified version.

^1.2.3 is an example of a caret requirement.

When considering “compatible” versions, 0.1 and 0.2 are not considered compatible, but 1.0 and 1.1 are for example. If no operator is specified, this is the default requirement (e.g. 1.3 is the same as ^1.3).

0.0.x is not considered compatible with any other version. Missing minor and patch versions are desugared to 0 but allow flexibility for that value.

^1.2.3 := >=1.2.3 <2.0.0
^0.2.3 := >=0.2.3 <0.3.0
^0.0.3 := >=0.0.3 <0.0.4
^0.0   := >=0.0.0 <0.1.0
^0     := >=0.0.0 <1.0.0

Tilde requirements specify a minimal version with some ability to update.

~1.2.3 is an example of a tilde requirement.

~1.2.3 := >=1.2.3 <1.3.0
~1.2   := >=1.2.0 <1.3.0
~1     := >=1.0.0 <2.0.0 "

you can also use '=' to specify an exact version

-- [1]
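To make the desugaring above concrete, here is a rough Python sketch (not Cargo's actual code; the function names are made up) of the upper bound implied by fully-specified caret and tilde requirements:

  # Rough sketch of the caret/tilde upper bounds quoted above, for
  # fully-specified requirements only. Not Cargo's implementation.
  def caret_upper_bound(major, minor, patch):
      """^major.minor.patch allows >= that version and < this bound."""
      if major > 0:
          return (major + 1, 0, 0)   # ^1.2.3 := >=1.2.3 <2.0.0
      if minor > 0:
          return (0, minor + 1, 0)   # ^0.2.3 := >=0.2.3 <0.3.0
      return (0, 0, patch + 1)       # ^0.0.3 := >=0.0.3 <0.0.4

  def tilde_upper_bound(major, minor=None):
      """~major[.minor[.patch]] allows patch-level (or, for a bare major, minor-level) updates."""
      if minor is None:
          return (major + 1, 0, 0)   # ~1     := >=1.0.0 <2.0.0
      return (major, minor + 1, 0)   # ~1.2.3 := >=1.2.3 <1.3.0

  assert caret_upper_bound(1, 2, 3) == (2, 0, 0)
  assert caret_upper_bound(0, 0, 3) == (0, 0, 4)
  assert tilde_upper_bound(1, 2) == (1, 3, 0)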

---

"

    sizeof(seq)/sizeof(seq[0]), C, statically allocated;
    strlen(seq), C, null terminated;
    len(seq), Python;
    length seq, Haskell;
    (count seq), Clojure;
    seq.length, JavaScript lists, Java arrays;
    seq.size(), Java sequences, jQuery expressions (deprecated)
    seq.count(), Django querysets; and
    SeqType'Length, Ada arrays

I've probably forgotten some, but please excuse me if I don't remember the ones I need when I need them. "

---

"After a few years of using and abusing Ruby's meta-programming, I've come to understand it as an anti-pattern, you incur technical debt when you use it and should refactor it away as soon as you know what you're trying to do with it. It's one thing to enshrine it as a part of the language, like C++ templates and the C preprocessor, because that forces you (hopefully!) to think about things like maintainability and concern separation. It's quite another to ad-hoc what amounts to a language extension to paper over quick-and-dirty design. And there's not much middle ground between the two. I'll ask myself the question, "do I want to extract this into a gem and maintain it separately?" and run pretty quickly into the normal coding equivalent of "am I making a game engine or a game?" My vision has to be pretty darned awesome for the answer to be "both."..Meta-programming is great to have, having access to a credit card is way better than not having one if you have the discipline to use it properly. " [2]

---

" The Lua standard library uses the POSIX C library, making it portable; but that also means that you can’t use sockets, it doesn’t support signals or timers, and all OS calls block the thread. Using the POSIX C library is like going backwards in time.

Enter libuv. Libuv is a fully asynchronous, multi-platform library that was originally written to support node.js but is now being used to power many languages. Tim Caswell (aka creationix) wrote a binding to this library called “luv”. He then went on to create a lightweight “base” executable, called “luvi”, that is composed of various C libraries with Lua interfaces. All of this can be linked statically into a binary that is less than 5 megabytes and includes libuv, zlib, openssl, pcre, and, of course, the Lua runtime with bindings to the aforementioned libraries.

You can use luvi to run Lua code, but it also comes with built-in support for bundling your code directly into the binary. This makes it super-easy to ship your Lua code. I found out about this project in November 2014, and started following its progress. "

---

http://davidvgalbraith.com/how-i-fixed-atom/ gives an example of a recursive 'regex':

(?<m>[^()]*\((?:\g<m>|[^()]*)\)[^()]*)*

in this regex dialect, '(?<m>...)' is a capture group named 'm', and '\g<m>' is a recursive 'call' back into group <m>
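Python's stdlib re module has no regex recursion, but the third-party regex module does (via (?&name) subroutine calls), so a hedged sketch of the same balanced-parentheses idea looks like this (the group name and test strings are mine):

  # Requires the third-party 'regex' module (pip install regex); the
  # stdlib 're' module does not support recursive subroutine calls.
  import regex

  # 'paren' matches one balanced (...) group whose body is any mix of
  # non-paren characters and recursive 'paren' matches.
  balanced = regex.compile(r'(?P<paren>\((?:[^()]|(?&paren))*\))')

  print(bool(balanced.fullmatch('(a(b)c)')))   # True  -- balanced
  print(bool(balanced.fullmatch('(a(b c')))    # False -- unbalanced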

---

https://github.com/electronicarts/EASTL/blob/master/doc/EASTL%20Quick%20Reference.pdf

pcwalton 2 days ago

We on the Rust team got to speak about the lessons learned from the EASTL with Paul Pedriana (the author of much of it) while designing the Rust standard library. It's a significant influence on the proposed allocators API currently in RFC. The EASTL is worth looking at for anyone interested in designing libraries that work well in low-memory environments. (Much of this library was written with devices like the Nintendo DS in mind, which had 4 MB--4 MB!--of RAM as I recall.)

reply

CyberDildonics 2 days ago

What parts of the EASTL are faster than the normal STL? Does it have better maps?

Edit: From a look at the source it definitely at least has some extra map classes for more specialized uses.

reply

corysama 2 days ago

One goal of EASTL is to be faster in non-optimized builds compared to other variants. If a game is running too slow to be playable, it can become impossible to debug! This was especially a problem on consoles with no OOE (out-of-order execution). It achieves this by sacrificing a lot of encapsulation and accepting non-DRY, manually inlined function implementations.

reply

to3m 1 day ago

I didn't realise this was a goal, but it's good that it was. C++'s prime faults tend to be spiteful iteration time (common cause: too many templates) and appalling performance in unoptimized builds (common cause: too many function calls, possibly due to overloaded operators and overly-finely-grained code that relies on inlining not to run like shit).

The famous maxim about how much cleverer you have to be to debug code than to write it always applies! No need to make things worse by forcing yourself to debug the optimized build.

reply

 JohnLeTigre 2 hours ago

http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n227... search for "20 - Performance comparison"

reply

 United857 2 days ago

Writeup on the key design and advantages of EASTL for their target platforms (game consoles): http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n227...

reply

http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2271.html

---

albertzeyer 2 days ago

STL is maybe a bit misleading because it doesn't seem like it is a drop-in replacement for the STL.

However, this is certainly interesting because the emphasis on speed (and thus also on simplicity) is what I'm sometimes missing in the real STL or in Boost. That's exactly what you need in a game but also in much other performance critical code.

As far as I remember, Chrome has used the STL earlier. But I looked now and I see this: https://chromium.googlesource.com/chromium/blink/+/master/So...

Gecko also seem to have a lot of custom stuff: https://github.com/mozilla/gecko-dev/tree/master/xpcom/strin...

Doom3 idLib: https://github.com/id-Software/DOOM-3-BFG/tree/master/neo/id...

Unreal engine: https://answers.unrealengine.com/questions/2695/rocket-and-t...

You'll find much more similar libs when you search in some of the big games.

reply

pcwalton 2 days ago

> As far as I remember, Chrome has used the STL earlier. But I looked now and I see this:

The WTF (Web Template Framework) is actually inherited from WebKit and used in the core rendering engine. I'm not sure how widely it's used in the Google parts of Chromium.

> Gecko also seem to have a lot of custom stuff

That's the old XPCOM stuff. While those string types continue to get a lot of use, the closest analogue to the STL is MFBT: https://github.com/mozilla/gecko-dev/tree/master/mfbt

reply

---

operatorIdentity should explicitly take a type:

operatorIdentity int (+) == 0

---

https://godoc.org/golang.org/x/tools/go/types

---

we shouldn't need python's namedtuples b/c everything should be implicitly overridable by getters/setters
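For reference, here is what namedtuple provides in Python (a tuple whose fields can also be read as attributes); the note above argues this wouldn't need to be a special type if field access were uniformly interceptable:

  from collections import namedtuple

  # An immutable tuple subclass with attribute-style access to its fields.
  Point = namedtuple('Point', ['x', 'y'])
  p = Point(3, 4)
  print(p.x, p[1])           # 3 4 -- attribute access and index access
  print(p._replace(x=10))    # Point(x=10, y=4) -- returns a new tuple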

---

http://awesome-python.com/

---

One guy's current advice on Haskell libraries:

http://www.stephendiehl.com/posts/production.html : "

Prefer to import libraries as qualified. Typically this is just considered good practice for business logic libraries, it makes it easier to locate the source of symbol definitions. The only point of ambiguity I’ve seen is disagreement amongst developers on which core libraries are common enough to import unqualified and how to handle symbols. This ranges the full spectrum from fully qualifying everything (Control.Monad.>>=) to common things like (Data.Maybe.maybe) or just disambiguating names like (Map.lookup).

Consider rolling an internal prelude. As we’ve all learned the hard way, the Prelude is not your friend. The consensus historically has favored the “Small Prelude Assumption” which presupposes that tools get pushed out into third party modules, even the core tools that are necessary to do anything (text, bytestring, vector, etc). This makes life easier for library authors at the cost of some struggle for downstream users.

In practice any non-trivial business logic module can very easily have 100+ lines just of imports, and frankly it gets tiring. One common way of abstracting this is by rolling a custom prelude using module reexports. Consider a minimal use case like the following:

module MegaCorpPrelude (
  module Exports,
) where

import Data.Int as Exports
import Data.Tuple as Exports
import Data.Maybe as Exports
import Data.String as Exports
import Data.Foldable as Exports
import Data.Traversable as Exports

import Control.Monad.Trans.Except as Exports
  (ExceptT(ExceptT), Except, except, runExcept, runExceptT, mapExcept,
   mapExceptT, withExcept, withExceptT)

This can be put into a cabal package which transitively pulls in the core dependencies and then is used in our downstream module.

{-# LANGUAGE NoImplicitPrelude #-}

import MegaCorpPrelude

There are several custom preludes that are available on Hackage in the Prelude category.

Haskell has world class libraries. There is an abundance of riches on Hackage in libraries like quickcheck, mtl, pipes, conduit, tasty, attoparsec, sbv and many more. Knowing where to start with the ecosystem can be a little tricky, and there are sometimes multiple competing solutions. A conservative start to a library might consist of something like the following build-depends in our cabal file:

  build-depends:
    base                 >= 4.6   && <4.9,
    deepseq              >= 1.3   && <1.5,
    hashable             >= 1.2.2 && <1.3,
    text                 >= 1.1   && <1.3,
    bytestring           >= 0.10  && <0.11,
    split                >= 0.2   && <0.3,
    unordered-containers >= 0.2   && <0.3,
    containers           >= 0.5   && <0.6,
    vector               >= 0.11  && <0.12,
    mtl                  >= 2.2   && <3.0,
    transformers         >= 0.4   && <0.6,
    time                 >= 1.6   && <1.7,
    process              >= 1.1   && <1.3,
    directory            >= 1.2   && <1.3,
    optparse-applicative >= 0.10  && <0.13 "

"

Configuration For configuration Bryan’s configurator library is invaluable. The library specifies an external configuration flat file which can hold credentials, connections and cluster topology information. A typical pattern is to embed this in a ReaderT? and then asks for any field necessary in downstream logic.

newtype ConfigM a = ConfigM (ReaderT ConnectInfo a)
  deriving (Monad, MonadReader ConnectInfo)

handleConfig :: FilePath -> IO ConnectInfo
handleConfig config_filename = do
    config   <- Config.load [ Config.Required config_filename ]
    hostname <- Config.require config "database.hostname"
    username <- Config.require config "database.username"
    database <- Config.require config "database.database"
    password <- Config.lookup config "database.password"
    return $ ConnectInfo
     { connectHost     = hostname
     , connectUser     = username
     , connectDatabase = database
     , connectPort     = 5432
     , connectPassword = fromMaybe "" password
     }

The configuration file might look like the following:

database {
  hostname = "mydb.rds.amazonaws.com"
  database = "employees"
  username = "stephen"
  password = "hunter2"
} "

" For large multi-package builds, I can’t speak highly enough of Neil Mitchell’s build system shake which is itself written in Haskell. The shake build uses Shakefiles which are monadic description of a graph of dependencies to resolve and their artifacts. For a contrived example consider running a Markdown file through Pandoc.

import Development.Shake
import Development.Shake.FilePath

main = shakeArgs shakeOptions $ do
    want ["book.html"]
    "book.html" *> \out -> do
        need ["book.md"]
        system' "pandoc" ["book.md","-o","book.html"] "

" Testing and building. For development builds using cabal sandboxes it’s usually essential to be able to pull in internal libraries that are not on Hackage. To do with cabal sandboxes this can be achieved with either a script to provision the dependencies.

$ git clone https://github.com/bscarlet/llvm-general
$ cd llvm-general
$ git checkout ca6489fdddde5c956a4032956e28099ff890a80b
$ cd ..
$ cabal sandbox add-source vendor/llvm-general-pure

With stack this can actually all be configured in the stack.yaml file.

packages:

Private TravisCI or Codeship are not worth the trouble of setting up if one ever envisions the project spanning multiple repos. Getting their virtual machine provisioned with the proper credentials to pull from multiple Github repos is still a source of trouble. For build slaves and continuous integration I’ve used BuildBot successfully to work with the usual cabal and stack toolchain. "

" Strings The strings types are mature, but unwieldy to work with in practice. It’s best to just make peace with the fact that in literally every module we’ll have boilerplate just to do simple manipulation and IO. OverloadedStrings? overcomes some of the issues, but it’s still annoying that you’ll end up playing string type-tetris a lot.

If you end up rolling a custom prelude it’s worth just correcting putStrLn and print to what they should be in a just world:

-- IO
putStr :: MonadIO m => Text -> m ()
putStr = liftIO . Data.Text.IO.putStr

putStrLn :: MonadIO m => Text -> m ()
putStrLn = liftIO . Data.Text.IO.putStrLn

print :: (MonadIO m, Show a) => a -> m ()
print = liftIO . Prelude.print

A common pattern is to use a multiparameter typeclass to do string conversions between all the common (Data.Text.Text, Data.Text.Lazy, Data.ByteString.UTF8, Data.ByteString.Lazy.UTF8, [Char]) types. You’ll end up eating at least one typeclass dictionary lookup per call to s, but this is fairly benign in most cases.

class StringConvert a b where
  s :: a -> b

instance (ToString a, FromString b) => StringConvert a b where
  s = fromString . toString

instance FromString UTF8.ByteString where
  fromString = UTF8.fromString

instance FromString LUTF8.ByteString where
  fromString = LUTF8.fromString

instance ToString UTF8.ByteString where
  toString = UTF8.toString

instance ToString LUTF8.ByteString where
  toString = LUTF8.toString "


curated list of Haskell libraries:

https://github.com/Gabriel439/post-rfc/blob/master/sotu.md

another one (with other stuff mixed in):

http://dev.stephendiehl.com/hask/

a 2015 update: http://www.stephendiehl.com/posts/haskell_2016.html

a poll:

http://www.stephendiehl.com/posts/poll.html

---

https://godoc.org/golang.org/x/net/context

recc. by http://www.jtolds.com/writing/2016/03/go-channels-are-bad-and-you-should-feel-bad/ and https://news.ycombinator.com/item?id=11211222

---

Apple concurrency libs, XPC (high-level) and Grand Central Dispatch (low-level):

http://stackoverflow.com/questions/10373331/nsoperation-vs-grand-central-dispatch
https://developer.apple.com/library/mac/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingXPCServices.html
http://nshipster.com/inter-process-communication/ (section XPC)

XPC primitive types:

-- [3]

---

order of params convention in Haskell:

Fargren 1 day ago

Does the order of the parameters matter? It's been a while since I've touched Haskell, so maybe it's obvious that this isn't [a] -> a -> Bool for some idiomatic reason. I guess that would be `isContained` instead of `contains`, so it probably wouldn't be the first thing I search for, but there's at least some potential for ambiguity.

reply

cormacrelf 23 hours ago

It's almost a convention in Haskell. The idea is, since currying functions is easy and common, you stick the argument that you'd want the least in a curried version last. So if you wanted to find out if x was in ten lists, you'd just:

  map (elem x) [list1, list2, ..., list10]

or, for folds, the function is the one least likely to change, so you put it first, and the list is most likely to change, so you put it last:

  foldl (+) 0 [1,2,3]
  map (foldl (-) 100) [[1,2],[2,3],[4,19]]

Of course, it's always debatable if you can find some situation where you wanted to curry in a different order, but generally you pick in order to reduce forced named lambda parameters. As far as I can tell, that's how it's done.

Edit: if you saw the sneaky edit, you'll know that this particular convention isn't easy to follow!

reply

eru 17 hours ago

There are a bunch of rules of thumb that help you decide faster. Like `needle before haystack'.

reply

---

a set of coreutils:

https://github.com/uutils/coreutils

---

http://magic.io/blog/uvloop-blazing-fast-python-networking/

" The asyncio module, introduced by PEP 3156, is a collection of network transports, protocols, and streams abstractions, with a pluggable event loop. The event loop is the heart of asyncio. It provides APIs for:

    scheduling calls,
    transmitting data over the network,
    performing DNS queries,
    handling OS signals,
    convenient abstractions to create servers and connections,
    working with subprocesses asynchronously.

uvloop is a drop-in replacement of the built-in asyncio event loop. "
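A minimal sketch of what "drop-in replacement" means in practice (the echo handler and port are made up; this assumes uvloop's EventLoopPolicy and the modern asyncio.run API):

  import asyncio
  import uvloop

  # Swap the default asyncio event loop for uvloop's libuv-based one;
  # everything below is ordinary asyncio code.
  asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())

  async def echo(reader, writer):
      data = await reader.read(1024)
      writer.write(data)        # echo the bytes back to the client
      await writer.drain()
      writer.close()

  async def main():
      server = await asyncio.start_server(echo, '127.0.0.1', 8888)
      async with server:
          await server.serve_forever()

  asyncio.run(main())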

---

some ES6 features mentioned in http://v8project.blogspot.com/2016/04/es6-es7-and-beyond.html: modules, classes, arrow functions, promises, iterators/generators, proxies, well-known symbols, the ES6 RegExp sticky flag, and proper tail calls.

Some ES7 features mentioned in the same post: the exponentiation operator, Array.prototype.includes(), the async/await keywords, Object.values()/Object.entries(), String.prototype.padStart()/String.prototype.padEnd(), RegExp lookbehind, and a mechanism for capturing unhandled Promise rejections like uncaught errors.

---

apparently this open source project implemented a bunch of Java libraries:

https://en.wikipedia.org/wiki/Apache_Harmony

(recc. by https://news.ycombinator.com/item?id=11679982 )

---

https://github.com/rails/rails/blob/6dec7645192bd89d81d47542255a346f93710830/activesupport/lib/active_support/core_ext/object/blank.rb#L99-L120

https://github.com/SamSaffron/fast_blank

---

example of some lib functionality we eventually need:

http://wiki.gkbrk.com/Hotel_Music.htm

---

https://github.com/andlabs/libui

---

a great study sheds light on which parts of POSIX are most used, which parts are unused, and which parts are being bypassed by higher-level libraries because POSIX is missing something they need: POSIX Abstractions in Modern Operating Systems: The Old, the New, and the Missing

"usage is driven by high-level frameworks...((in the areas of))...IPC, thread pool management, relational databases, and graphics support"

"extension APIs, namely ioctl , dominate mod- ern POSIX usage patterns as OS developers resort to them to build support for abstractions missing from the POSIX standard"

"For example, the ioctl interface is now regularly used to mediate complex graphics commands be- tween the high-level OpenGL?