
Through search results or reference links, I often come across documentation produced some time ago: a few months, a few years. Sometimes the documents are marked as deprecated or otherwise out of date, but not always.

But even if historical, to what extent are they still applicable? Is it possible to draw a general map of the *nix field's change dynamics? How likely are certain areas to have changed (or to change in the future), by how much, and how often?

This question is similar to How updated and relevant is "The Linux Documentation Project"?, but it is more general. I'm asking for baselines like:

(these are just examples, not claims about reality)

  • "The main filesystem structure has not changed for 25 years and will not change for the next 25, as it's guaranteed by POSIX, because that would turn the *nix world upside down. You can rely on documentation up to 25 years old."

  • "The C libraries' API changes every few months, especially in the area of networking, which is in frantic bloom. But there's a core set of functions that are considered a standard, have been kept unchanged ever since, and will never change, because that would mean reinventing the wheel."

Indirectly, this question also asks to what extent tips on Q&A sites such as this one stay valid over time.

2 Answers


The two quotes you posted sound like someone's opinion, not like "documentation".

Standards are moving targets. One may aim to implement them, but every few years they get updated, and there will always be extensions and instances where a standard is purposely not followed. This goes for "proper standards" as well as for "ad hoc standards".

Standards like POSIX, though, change only slowly over time, with new things being introduced as they are found useful, in demand and widely accepted, and old things being dropped as they fall out of use. This is why some here on U&L are keen to point out that some shell constructs (for example) aren't POSIX-compatible, but only work (or not, as the case may be) with certain versions of certain implementations of the tools. It's a way of making an answer outlive any particular implementation offered by GNU or some other vendor.
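As a small illustration of that point, here is a sketch of a string-prefix test written two ways: the first uses `case` pattern matching, which the POSIX shell language specifies and which runs in dash, ash, ksh and bash alike; the second (left commented out) uses the `[[ ]]` construct, an extension that fails outright in a strict POSIX sh such as dash:

```shell
# Portable prefix test: `case` pattern matching is specified by POSIX,
# so this runs identically in dash, ash, ksh, bash, ...
s="hello"
case $s in
  he*) result="s starts with he" ;;
  *)   result="no match" ;;
esac
echo "$result"

# The bash/ksh extension below does the same job, but is not POSIX and
# will be a syntax error in a minimal /bin/sh:
# if [[ $s == he* ]]; then result="s starts with he"; fi
```

An answer written in the first style keeps working when the reader's `/bin/sh` is not bash, which is exactly the kind of longevity the paragraph above describes.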

The best place to look for documentation about the interfaces and tools on your system will always be the on-line manuals on your system, as well as any other form of documentation possibly distributed with the software you're using.

You may search the web for what a certain flag to the sed or grep utilities does, or how to use Getopt::Long in Perl, but it will be the manual installed with the actual utility or library on your system that is the definitive documentation.
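In the same spirit, you can probe the tool you actually have instead of trusting a web page's claim about a flag. A sketch using `-P` (Perl-compatible regular expressions), a GNU grep extension that many BSD greps lack:

```shell
# Ask the installed grep itself whether it supports -P, rather than
# assuming what some web page says about "grep":
if printf 'x\n' | grep -P 'x' >/dev/null 2>&1; then
  support="this grep supports -P"
else
  support="this grep lacks -P; see 'man grep' on this system for alternatives"
fi
echo "$support"
```

Either way, the authoritative answer for *your* system comes from the binary and manual installed on it, not from the web.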

Historical documentation is useful for people running those same old systems, there's no denying it, but a piece of documentation is usually intended for a system or tool as it was at the time of writing. If you're on a fresh OpenBSD 5.9 system, for example, and wonder why sudo doesn't work because you're reading a web page saying it's supposed to be in the base system, well, it would have been better if you'd read the afterboot(8) manual (as prompted) that describes the system you've just installed. It will tell you about the doas utility.
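To sketch where that manual leads: a minimal `/etc/doas.conf` granting members of the `wheel` group root access might look like the fragment below. (This is an assumed typical setup for illustration; consult doas.conf(5) on your own release for the authoritative syntax and options.)

```
# /etc/doas.conf: allow members of the wheel group to run commands as root
permit :wheel
```

Which is the point: the sudo HOWTO you found on the web describes a different tool than the one this system ships.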

My point is, it doesn't matter where Unix is going or where it's coming from. You're in front of a machine running it right now, and that's where you'll find the most up-to-date documentation.

If you see something on the web describing how to set up UUCP for exchanging emails between hosts, or how to do database replication in PostgreSQL, you have to take into account who the intended audience of the document is, and that these tools may exist in different versions and behave differently for you today. The manuals on your system have you, the user wanting to know how to use the system in front of you, as their intended audience.

I'm sorry if I missed the mark a bit with my answer, but it's something I've been thinking about a bit lately.

Kusalananda
  • Though not what I asked for, your answer is valuable, and I will upvote it tomorrow, as today I've run out of votes :-) My first thought after reading your answer is that your view is extreme in its skepticism. For an argument, I'll refer to Java. In Java, there is no guarantee that in future versions every object will extend `Object`. But that's what makes Java Java; that's the model it's built on. Similarly, the method `toString` might be replaced. But how likely is that? Sorry for this analogy, you may not know Java. But my question is about drawing these kinds of lines. –  Aug 15 '16 at 16:16
  • I'd extend this answer to the official documentation for the same version (potentially +- a bit). Particularly reference documentation (as opposed to e.g. tutorials). Plus, you can look up the reference documentation for the options you would use, for example. It can be very informative, and it should show you if the option has been removed, or an alternative is now recommended. – sourcejedi Aug 15 '16 at 16:24

I would say no, you can't a priori draw a map to navigate this safely.

You mentioned Q&A sites. IME, Q&A forums are often terrible at this. You get a smattering of opinions, but not fully reasoned explanations. Third-party "documentation" turned up by web searches, like people's blog posts, is often of similar quality. Granted, the answers are useful at the time; they let you see other people's experiences and readings. But a reasoned explanation, with reference to the primary sources, can be educational even when it's completely obsolete.

Since you ask: I'd say POSIX is about it. A multitude of vendors standardized 1) system calls and 2) utility commands, and these will stay functional in the future to preserve compatibility with existing applications.

Again, remember that the authority is the standard itself. My CS course at a highly ranked university conflated the POSIX threads standard with Linux's initial, inconsistent attempt at implementing it, ignoring the counterexample of the second implementation (NPTL). And material from these courses is often made available online...

The problem is that once something has been agreed and enshrined in a standard, it doesn't necessarily stay relevant and interesting. I feel the failure of the Linux Standard Base is an example of this. (Note that recent comparable efforts, like Flatpak apps, are built against versioned runtimes. And look at how fast GTK is being changed.)

I think looking at security provides strong examples. We just haven't worked out how to build a secure system yet. Systems that are old / unpatched / haven't had mitigations applied for the bug du jour are considered completely broken. So they change, constantly.

Caveat to the POSIX love: operating systems used in the real world will deviate from the standard in some way. OS X is POSIX-certified, yet its fsync() implementation has been lawyered out of doing what everyone else considers the intended meaning. Certain Linux greybeards argue we should break apps that use annoying filenames, e.g. ones that include control characters. Etc.
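To see why those filenames are considered annoying: a filename containing a newline is legal on most Unix filesystems, but it breaks any naive line-oriented pipeline. The sketch below creates one in a scratch directory and counts files robustly using NUL-delimited output (`find -print0` is a widespread GNU/BSD extension, not required by POSIX):

```shell
# A newline embedded in a filename defeats line-based tools like
# `find | wc -l`. Counting NUL terminators instead is robust.
dir=$(mktemp -d)
: > "$dir/ordinary.txt"
: > "$dir/bad
name.txt"

# Each file contributes exactly one NUL byte to the stream:
count=$(find "$dir" -type f -print0 | tr -cd '\0' | wc -c)
echo "files found: $count"

rm -rf "$dir"
```

The naive `find "$dir" -type f | wc -l` would report three "files" here, which is exactly the breakage the greybeards are arguing about.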

sourcejedi