
Re: Definition of variables

PostPosted: May 30th, 2015, 8:16 am
by Ajay Askoolum
There is a place for version control. I have used Visual Source Safe and derivatives in the past, as well as SVN; currently, I use Team Foundation Server, primarily for C# code.

For APL+Win, I find that a simple layered approach works for me.

1. In the first pass, simply calculate the checksum of like-named functions; if the checksums differ, examine each function in more detail.
2. I find that version control systems with branching and merging capabilities do not deliver on their promise: manual interpretation is always necessary.
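For illustration only (in Python rather than APL+Win, with a hypothetical dict-of-sources standing in for a workspace), the first pass above might be sketched like this:

```python
import hashlib

def checksum(source):
    """Checksum of a function's source text (MD5 is fine for comparison, not security)."""
    return hashlib.md5(source.encode("utf-8")).hexdigest()

def first_pass(ws_a, ws_b):
    """Return the like-named functions whose checksums differ and so need a closer look."""
    return sorted(name for name in ws_a.keys() & ws_b.keys()
                  if checksum(ws_a[name]) != checksum(ws_b[name]))

# Hypothetical "workspaces": function name -> source text
ws1 = {"Mean": "r←Mean v\nr←(+/v)÷⍴v", "Sum": "r←Sum v\nr←+/v"}
ws2 = {"Mean": "r←Mean v\nr←(+⌿v)÷≢v", "Sum": "r←Sum v\nr←+/v"}
print(first_pass(ws1, ws2))  # only Mean needs a detailed look
```

Functions whose checksums match can be skipped; only the survivors of this pass get the expensive detailed comparison.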

In general, having the like-named objects in a database rather than in different workspaces simplifies the comparison.

On a more general note, I find that there is a concerted effort to introduce pseudo-science into programming. In making this statement, I have in mind the proponents of programming methodologies (Agile, Waterfall, etc.), testing frameworks (test-driven development, mocking frameworks) and, of course, version control.

It is interesting that no programming environment (interpreted or compiled) or other tool yet has the capability of certifying compliance (on a scale of 0 to 100, say) with any programming methodology, testing framework, design (as in tiers), DRY principle (Don't Repeat Yourself, i.e. design for code re-use), design pattern, or naming convention. This leads me to some sympathy with the viewpoint that programming, whatever its scale, remains a cottage industry; otherwise, we would surely have perfect code generators and translators (from one language to another) by now. A while back, I used R: that language has a prime directive, namely, get working code first. Everything else, like optimisation and runtime, comes after. I have been using APL this way since I started.

Incidentally, I have been using the coexistence of differing versions of like-named variables and functions to manage applications that have several deployment versions, namely bespoke versions, or simply versions that use different databases. The deployment version is collated at runtime.

Version control (was: Definition of variables)

PostPosted: May 30th, 2015, 2:08 pm
by Davin Church
Yep - version control has both uses and issues. The automated merging you mentioned may work well enough for very verbose languages (where users are usually editing different parts of the file), but it's useless for APL (and mostly useless for C# too, I would think).

I don't use any commercial/public version control packages in my system (which I call "ARM") - it's written from the ground up in pure APL and integrated tightly into the APL environment. Using it requires virtually no deviation from how you ordinarily work, and there's no "outside" work to be performed either (in particular, no working with separate copies of code in text files). We currently have three programmers working on a big application and it works marvelously for keeping us synchronized and automatically preventing us from stepping on each other's toes (without explicit checkouts). But even for a single user it's terribly handy to be able to ask "what did I change in that function?" (which I find myself doing surprisingly often).

I also use checksums for initial comparisons (for both functions and variables) and they perform well in that regard. Difference comparisons are done directly either between versions in ARM or between ARM and the workspace (implicitly or explicitly) and definitely make life easier than having to make separate copies of an object just for comparison.
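That checksum-then-diff pattern can be sketched in a few lines (a Python illustration only - ARM itself is pure APL, and the function sources here are hypothetical):

```python
import difflib
import zlib

def quick_same(a, b):
    """Cheap checksum screen: skip the full diff when the sources are identical."""
    return zlib.crc32(a.encode("utf-8")) == zlib.crc32(b.encode("utf-8"))

def show_diff(name, old, new):
    """Line-by-line unified diff, produced only when the checksums differ."""
    if quick_same(old, new):
        return []
    return list(difflib.unified_diff(
        old.splitlines(), new.splitlines(),
        fromfile=f"{name} (repository)", tofile=f"{name} (workspace)",
        lineterm=""))

old = "r←Avg v\nr←(+/v)÷⍴v"
new = "r←Avg v\nr←(+/v)÷1⌈⍴v"   # workspace copy guards against empty vectors
for line in show_diff("Avg", old, new):
    print(line)
```

The diff is computed directly between the two stored versions, so no temporary copy of the object is ever needed just for comparison.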

I've still got expansion plans for ARM if I ever get more time to work on it. For instance, it doesn't currently handle external files, but I'd like to be able to deal with both text and binary data as well as APL component files (at a component level) and structured files (such as old-style function files of arbitrary types). I've also planned for user-created hooks into the system whereby a site administrator could write their own validation and compliance routines (in APL) and link them directly into ARM. Then anyone trying to apply an update would have to pass muster against the company standards before being accepted. That could at least provide some small measure of compliance with low-level coding practices, even if not the high-level conceptual styles you're looking for.
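The hook mechanism described above might look something like this (a Python sketch with made-up site rules; ARM's actual hooks would be written in APL):

```python
def no_tabs(name, source):
    """Hypothetical site rule: no tab characters in source code."""
    return "\t" not in source

def has_header_comment(name, source):
    """Hypothetical site rule: the second line must be a comment (⍝ in APL)."""
    lines = source.splitlines()
    return len(lines) > 1 and lines[1].lstrip().startswith("⍝")

# The site administrator registers whatever validation routines they choose.
HOOKS = [no_tabs, has_header_comment]

def passes_muster(name, source):
    """Run every registered hook; the update is accepted only if all of them pass."""
    failed = [h.__name__ for h in HOOKS if not h(name, source)]
    return (not failed, failed)

ok, failed = passes_muster("Mean", "r←Mean v\n⍝ arithmetic mean\nr←(+/v)÷⍴v")
print(ok, failed)  # True []
```

An update that fails any hook would be rejected with the list of violated rules, which is exactly the "pass muster against the company standards" step.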

Version control (was: Definition of variables)

PostPosted: May 30th, 2015, 3:06 pm
by Ajay Askoolum
ARM sounds very interesting, perhaps an ideal topic for a future user conference. However, it would need to be a commercial product first. Any plans in that regard?

One of the buzzwords in contemporary programming is 'continuous integration'. Basically, it means you check in your changes as often as possible and then re-compile the project; if compilation fails (because of the changes), fixing it becomes your first priority.
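The gate at the heart of that cycle is trivial to sketch (in Python here, with a stand-in build command - nothing APL-specific):

```python
import subprocess
import sys

def ci_gate(build_cmd):
    """Run the project's build; a non-zero exit code means the integration is
    broken and fixing it becomes the first priority."""
    result = subprocess.run(build_cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print("BUILD BROKEN - fix this before any new work:")
        print(result.stderr)
        return False
    print("Build green; carry on.")
    return True

# A stand-in build step; for APL there is no compile, so fixing all functions
# (⎕FX) or running a test suite would play the same role.
ci_gate([sys.executable, "-c", "print('ok')"])
```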

APL is not compiled. In ARM, when do changes get committed to the repository - as they happen, or at the programmer's discretion?

Re: Definition of variables

PostPosted: May 30th, 2015, 8:35 pm
by Davin Church
Actually, I did give a short talk on ARM at the 2012 APL2000 conference (along with some nice differencing tools and a named-component file system, both of which it uses internally), and passed out copies to the attendees. However, I didn't get a very enthusiastic response. My best guess is that everyone there already likes doing things the way they do and isn't that interested in "improvements".

I'd be pleased to make it a commercial product if there were enough interest. But alas, I doubt there are enough APL programmers around to make it a viable off-the-shelf product - I'd have to charge more than most people would be willing to pay to keep it going. However, I could be talked into giving it away (without formal warranty responsibilities) if I could perhaps get a few donations from time to time to help pay for ongoing maintenance. Better yet, if a larger company wanted to use it but needed additional features (such as external file support), they could pay for the extra development work (or share the cost with other companies); they'd get what they need and I would get an improved base product out of the deal that everyone could use. I don't suppose you know anyone who would be interested in that sort of thing, do you? Hmmm... I wonder if I could sell expensive copies to companies that included bug-fix support and free minor updates (to make them happy that it was supported) while at the same time giving away free copies without on-demand support? Don't you think that might upset anyone who was paying for it?

ARM updates are made only when the programmer is ready. Updating could be automated upon editing, but I wouldn't recommend that for multi-programmer environments. The design philosophy is that development and testing are done in the programmer's workspace, as has always been done. When the programmer is happy that a new change works, they just issue the command ]ARM PUT * (even from a function key) and it's all saved away. That way everything that's in ARM - and that is thereafter given automatically to the other programmers to run for themselves - is considered to be tested and working (i.e. continuously integrated). Otherwise you'd have people putting in half-finished code and possibly breaking the application for other programmers who are testing their own work. Of course, ARM includes many other options for more complicated situations and handles development environments of any size and shape.

FYI -- I got into the discussion about ARM because it has to do a lot of the same kind of variable-manipulation (and multiple versions) work that you were inquiring about at the beginning of your thread.