Programming Things I Wish I Knew Earlier
theodp writes "Raw intellect ain't always all it's cracked up to be, advises Ted Dziuba in his introduction to Programming Things I Wish I Knew Earlier, so don't be too stubborn to learn the things that can save you from the headaches of over-engineering. Here's some sample how-to-avoid-over-complicating-things advice: 'If Linux can do it, you shouldn't. Don't use Hadoop MapReduce until you have a solid reason why xargs won't solve your problem. Don't implement your own lockservice when Linux's advisory file locking works just fine. Don't do image processing work with PIL unless you have proven that command-line ImageMagick won't do the job. Modern Linux distributions are capable of a lot, and most hard problems are already solved for you. You just need to know where to look.' Any cautionary tips you'd like to share from your own experience?"
Comment your code (Score:5, Insightful)
Re:Comment your code (Score:5, Funny)
I indented the code to make it readable. That's so obvious I don't need a comment to remind me.
Re:Comment your code (Score:5, Funny)
I don't know. I prefer to comment, just in case:
while 1:
    # Indentation
    dosomething()
Re:Comment your code (Score:5, Funny)
I indented the code to make it readable. That's so obvious I don't need a comment to remind me.
(pun indented)
Re: (Score:3, Funny)
Re: (Score:3, Insightful)
That's what the GP meant. Comment why, not how.
You only need the how when you do something that people might miss, such as a switch fall-through, or might not understand, like some tricky Boolean operation.
Everything else should be about what the code is attempting to do, and why it's trying to do it that way. When that's obvious, you don't need any comments at all. (Although function headings are still a good idea.)
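To make the GP's distinction concrete, here is a tiny Python sketch (the retry scenario is invented): the first comment restates what the code does; the second records a why that the code cannot express.

    # How-comment: useless, it just restates the line below it.
    retries += 1  # add one to retries

    # Why-comment: useful, it captures intent the code can't show.
    retries += 1  # the vendor API drops ~1% of calls; one retry masks that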
Re: (Score:2, Funny)
"indented the code to do"
So the real question is tabs or spaces?
Re: (Score:2, Funny)
> Remember that comments are not for describing what the code technically does (that is what the code is for), comments are for what the code is intended to do.
Corollary: Never debug comments, only code.
Re:Comment your code (Score:5, Insightful)
Re:Comment your code (Score:5, Funny)
Dead comments mislead the person following later into believing a lie, a lie that could potentially have major impacts on the software.
// the following code delivers cake to the subject
Re:Comment your code (Score:5, Funny)
// the following code delivers cake to the subject
// the above comment explains the joke
== changelog ==
* removed redundant comments
Re: (Score:3, Funny)
Re: (Score:3, Interesting)
That's because people who don't know what they are doing don't know that they don't know what they're doing. Those types of comments should be accompanied by a clear competence or acceptance test. For example, the last such comment I wrote went something like: /* This might look like an unnecessary delay, but the timing has been carefully calibrated against a wide range of marginal real world conditions. If you touch this function, you must ensure it does not time out under these configurations... */
Re: (Score:3, Interesting)
Re: (Score:2)
Re:Comment your code (Score:5, Insightful)
Commenting code isn't enough, it's just a small part of the design and documentation process. Comments are there to tie the code to the relevant part in your design document, which really is a part of programming people should put more effort into.
Re:Comment your code (Score:5, Interesting)
It's been said for years, but it is almost never done. When it is done, it's most often (IME) done _after the fact_ because of some requirement to produce the paperwork. Perhaps it's time to give up on it. Is there a real reason for insisting on a design document, or is it just some sort of self-flagellation on the part of programmers?
Re:Comment your code (Score:4, Interesting)
I have tried implementing a 'design document' process for the better part of the 10 years I've been with this group. It's never gotten done. We came close about a year ago. Here's why I still try (while knowing that it'll never get done):
There's a reason architects use blueprints.
Re:Comment your code (Score:5, Insightful)
Which is? If it's along the lines of "It's easier and cheaper to fix a blueprint than a building", then it does not apply to software.
Re: (Score:3, Interesting)
Re:Comment your code (Score:5, Informative)
Why would you be giving up screen space with tabs? Most programming text editors let people adjust how wide tabs display, to whatever indentation they want. So if someone wants to view tabs as 2 spaces, another prefers 4, and another prefers 8, they don't have to change anything in the code if their editors are already set up for that.
The disadvantage of tabs is when you have content that requires spaces (text strings, etc.) and for some reason you want parts of it to align with your indents. Or when you mix your indentation, spaces and tabs (try not to do that, OK?).
Re:Comment your code (Score:5, Insightful)
Put enough comments in your code so that five years from now you (and others) can remember what you indented [sic] the code to do.
But not so many that you (or others) will find it more work than it's worth to change the comments when the code changes.
I prefer code with no comments to code with actively misleading comments, and I hate code with no comments! :)
Re:Comment your code (Score:5, Interesting)
But not so many that you (or others) will find it more work than it's worth to change the comments when the code changes.
I prefer code with no comments to code with actively misleading comments, and I hate code with no comments! :)
The trick is that if you are writing comments describing what the code is intended to do, you can write those comments in something like JML [ucf.edu] or Frama-C [frama-c.com]'s ACSL. That way you can use ESC/Java2 and JUnit, or Frama-C, to check that the code does what you intended. You get two benefits: more rigorous checks on your code (including use of the theorem provers behind ESC/Java2 and Frama-C), and if your documentation ever falls out of date with the code, you'll immediately get errors flagged.
Re: (Score:3, Informative)
I really think that means you need to become better acquainted with those tools. Specifying what you intend to happen is a lot easier than specifying how to do it. The first is a specification, the second is an implementation. Consider: I can specify a sqrt function by simply saying the result squared should be within some error tolerance of the input; that is a long way from writing an implementation of a square root function. Likewise, I can specify a sort function (the output list should contain exactly the same elements as the input, in order) without writing the sort itself.
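For readers without those tools handy, here is the flavor of "specify what, not how" sketched in plain Python asserts (function names invented); a real annotation language like JML or ACSL lets a checker or prover verify such properties statically rather than merely testing them:

    import math

    def check_sqrt_spec(sqrt_fn, x, tol=1e-9):
        # The spec: the result squared is within tolerance of the input.
        # Nothing here says HOW to compute a square root.
        r = sqrt_fn(x)
        assert x >= 0 and abs(r * r - x) <= tol * (x + 1)

    def check_sort_spec(sort_fn, xs):
        # The spec: output is ordered and is a permutation of the input.
        ys = sort_fn(xs)
        assert all(a <= b for a, b in zip(ys, ys[1:]))
        assert sorted(xs) == sorted(ys)

    check_sqrt_spec(math.sqrt, 2.0)
    check_sort_spec(sorted, [3, 1, 2])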
Re: (Score:2, Insightful)
This is the best, IMHO. I, too, have looked at code and gone, "WTF was this idiot thinking???? ...Oh wait... that's my code."
And agreed - dead comments have to be updated. I didn't always do it perfectly, but I pretty much got in the habit a few years ago of commenting as I go, right by the code in question. One of those times when it's good to be a bit verbose.
Re:Comment your code (Score:5, Insightful)
I really need to find some kind of "idiots guide to when to comment and when NOT to comment."
Re:Comment your code (Score:5, Insightful)
Think of comments as API documentation. If something complicated is going on, I usually include several usage examples as well. Just look at the documentation of 3rd-party libraries, and try to follow that granularity and style.
Re: (Score:3, Insightful)
In other words you document the intent of every paragraph of code, and a high level explanation of how it does it.
No offense, but I despise that style of commenting. I can tell what your code is doing by reading the code. What I don't know is why you're doing it that way. Why are you looking for needles? Have you proven needles are the only thing in your haystack made out of iron, so that the code won't detect nails, screws, and hatchets six months from now? Is there a reason your haystack is a list and not a binary tree? These are the things I need to know to maintain your code. I don't need to be told how a loop works.
Re: (Score:3, Informative)
x = y + 4;
it's more like
x = y+4;
So you explain *what* you're doing in HIGH LEVEL terms, and set CONTEXT. And you don't comment each line, you comment each logical block.
Re:Comment your code (Score:5, Funny)
Clearly, this is why you need comments! Haystacks aren't lists - they're heaps.
Re: (Score:2)
The problem is that it's easy to follow this rule and forget another important one: If the code you are writing needs a lot of comments, chances are you need to try a different solution, because you are trying one that has the complexity in the wrong places.
The best code has enough comments to be understandable, but it also doesn't have anywhere near as many comments as there is code.
Re:Comment your code (Score:5, Funny)
Put enough comments in your code so that five years from now you (and others) can remember what you indented the code to do.
I know, Python right?
Re: (Score:3, Insightful)
My current annoyance with python:
Re:Comment your code (Score:4, Insightful)
Solution: Stop mindlessly copying and pasting.
I lost track of the amount of code I've seen that is a straight duplicate from some area that does vaguely the same thing. Then all the variable names don't match the new context, comments are not applicable and assumptions are not necessarily correct. Doesn't matter what language it is written in.
Of course the person who does it never bothered to look through it and only relied on compiler error messages to detect the obvious "not declared" errors. What you end up with is a mess that is nearly guaranteed to have bugs.
If tabbing the new pasted code into place forces you to actually read it and think about what you just did, then it is well worth it!
Re: (Score:3, Interesting)
Put enough comments in your code so that five years from now you (and others) can remember what you indented the code to do.
Re:Comment your code (Score:5, Interesting)
If you are new to coding, don't be a bedroom programmer. You are no longer writing a 10,000 line app alone in your bedroom. You may be working on a million line app with a team. Change your habits accordingly. Learn to work with other people.
Programming is one of those things that humans are not quite smart enough to do. This means you. Check your ego at the door. In the early 90's, IBM estimated that 80% of large projects in the industry (one million lines or more) were "abandoned in disgust". This should give you some idea of what you are up against.
Come to work knowing what you are doing. This may mean cramming in your off hours. Don't say that you don't know how to do something. Say that you do and then learn it!
Put in comments where they are needed, and maintain them. You will forget what you were doing within three months. The harder it was to code, the more you need the comments.
Use descriptive variable names. Try to organize your data into conceptually simple variables where possible.
If you have to complicate a mathematical formula by breaking it into sections appropriate for inner and outer loops, put the formula in the comments. It may even be worth putting in an ASCII diagram if you are working with geometry.
If you can't see the bug, it's because you have become blind to the code. Get someone else to take a look. The mistake may be embarrassingly obvious to a new set of eyes.
If speed is a factor, preprocess the data. Offload runtime cycles to preprocessing.
Maintain an up to date user manual for all tools and apps. Add to it as you add features, update it as you update the features.
Avoid magic numbers where possible, and put any magic numbers you do use into defines, again with descriptive names (see the sketch after this list).
If you can, avoid virtual methods and pointers in streamed objects. This way you can bulk load them and bulk write them. Indices are often fast enough, or can be converted to pointers if need be after loading.
If you have lots of booleans, consider a bit array.
Try to write reusable code. Code for the general case when possible, but...
Normalize your data and objects. Don't waste memory and time maintaining variables you don't need. Don't repeat yourself.
Your key indexes should be integers, never strings. Yes, I have seen databases keyed on memo fields--they were tragically slow.
If updating an existing project, get the client to sign off on what is not to be changed or fixed, and make certain that the QA department gets this list. Otherwise bugs will creep onto the list that you are not actually required to fix, expanding the scope of the project.
Build test harnesses whenever you can, ones that can be turned on with a simple switch. This will make regression testing a lot easier.
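Here is a quick Python sketch of the magic-number and boolean tips above (every name is invented for illustration):

    MAX_RETRIES = 5          # a named constant instead of a bare 5 in the code
    CONNECT_TIMEOUT_S = 30   # units in the name, so nobody has to guess

    # Bit flags packed into one integer instead of a dozen booleans:
    FLAG_VISIBLE = 1 << 0
    FLAG_LOCKED = 1 << 1
    FLAG_DIRTY = 1 << 2

    state = FLAG_VISIBLE | FLAG_DIRTY
    if state & FLAG_DIRTY:      # test a flag
        state &= ~FLAG_DIRTY    # clear it once the object is saved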
Re:Comment your code (Score:4, Insightful)
Come to work knowing what you are doing. This may mean cramming in your off hours. Don't say that you don't know how to do something. Say that you do and then learn it!
Please ignore that. Honesty is always better - and if your team doesn't welcome honesty, try to change the team's attitude. If you trust each other your results will be better. If you don't know how to do something, say you don't and THEN learn it.
Re: (Score:3, Informative)
Re: (Score:3, Insightful)
Sure, assuming that everyone who ever uses that code is using the same code repository.
I've had to modify code professionally that was written over 30 years before I encountered it (older Fortran 66 stuff), and the comments in the code were invaluable to me. The folks who wrote it had no idea that their code would be running at another company when it was written.
Re:Comment your code (Score:5, Insightful)
When code and comments disagree, both are probably wrong. -- Norm Schryer
Re: (Score:3, Insightful)
Yes, because what's clear to you is going to be clear to everyone else who ever reads it. And it'll be perfectly clear to you in 6 months or 6 years when you try to alter it. Yup.
There is no such thing as self documenting code. If you ignore comments and/or documentation, you're a shit poor programmer and a blight on the profession.
The hard way is more fun (Score:5, Interesting)
The truth is that the "hard" way of doing things is often more fun, because you have the challenge of learning a new tool or API. Plus sometimes it's actually easier in the long run because you've engineered a solution for the outer bounds conditions of scalability, so if your application takes off, it can handle the load.
I guess the real issue is that you have to engineer a "good enough" solution rather than a "worst case" solution.
Re: (Score:2)
Re: (Score:2)
Also, doing something the "hard way" the first time often leads to a greater understanding and appreciation of the "easy way."
Reading the article, I fail to see why I should avoid PIL for ImageMagick. In neither case is Linux going to just "do it" for me. And in either case I have to tell PIL or ImageMagick how to process my images, right?
Re: (Score:3, Insightful)
Well, if you are coding professionally, you are wasting time going the hard route unless there is a real good reason to do so. Newbies want to show off that they can do all these things. Mature programmers know they can, but if there is an easier way then go for it; it gets your project done faster and you look better to your boss, who couldn't give a rat's ass how you did it, just as long as there aren't any consequences in the future.
Re: (Score:3, Informative)
Re:The hard way is more fun (Score:5, Insightful)
Re: (Score:3, Insightful)
Re:The hard way is more fun^H^H^H educational (Score:2)
Re: (Score:2)
The problem is that you're far more likely to end up with crappy code. If you want to ask your user for a password, it's insanely easier to do a system("stty -echo") than to learn the ioctls to accomplish the same thing, plus you get portability for free.
I recently replaced an SMTP protocol implementation with a pipe to /usr/sbin/sendmail. Guess what? It worked much better.
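The same trick is only a few lines from Python as well; a rough sketch, assuming a sendmail-compatible binary at /usr/sbin/sendmail:

    import subprocess

    def send_mail(raw_message: bytes) -> None:
        # -t tells sendmail to take the recipients from the message headers
        subprocess.run(["/usr/sbin/sendmail", "-t"],
                       input=raw_message, check=True)

    send_mail(b"To: ops@example.com\nSubject: ping\n\nIt worked much better.\n")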
Re:The hard way is more fun (Score:4, Interesting)
Actually I volunteer for the projects that are going to expose me to something new, rather than only taking on projects where I already know how the solution will work. The latter are bread-n-butter to the company, the former are the future of the company.
For example, I've spent the past year on a Freeswitch project rather than on the older Asterisk based code. Freeswitch scales better, is better architected, and is more flexible. The downside was spending 3-6 months working with the Freeswitch team to resolve issues with the code.
In the end, Freeswitch is where we are going; Asterisk is where we were. At the time the Asterisk code was started, Freeswitch hadn't even reached its first release, so Freeswitch wasn't an option back then.
Next up is a rework of the database IO codebase so that it becomes feasible to plug-n-play different databases. We could do it with the existing code base, but it would be very painful, kludgy, and difficult to maintain. Instead we're going to make a clean break on our next release to a new architecture for the database code. Sure it'll take longer at first -- but by the time we're on to our third database we should be well ahead of the curve and saving time.
Re: (Score:3, Insightful)
Hardware can certainly be the correct answer to the problem. Are you disk-bound? Then you need faster disks. Can't fix that in software. Saturating your pipe? You need more bandwidth.
Yes, you can do things to try and cut back on your use of those resources, but that's ultimately going to produce much less effective results unless you're at massive scale (read: you have your own datacenter) where a 1% performance gain frees up the equivalent of several hundred servers.
Hardware is also a good way to buy yourself time.
Lesson #8 (Score:5, Insightful)
Don't ask for advice about programming on slashdot unless you have a pile of salt grains ready.
Re:Lesson #8 (Score:5, Insightful)
Re:Lesson #8 (Score:5, Funny)
The reasons are twofold:
1) The number of morons *always* exceeds the number of experienced and reasonable people
2) Usually, morons don't have a clue that they... well... are morons!
I for one can assure you that I'm NOT a moron :-)
Re: (Score:3, Funny)
There's no substitute for a few solid courses in theory and design.
You must be new here. When I started programming, there were only a handful of colleges that had computer programming departments.
You came along late enough that you might have had decent instructors.
And I walked to and from college both ways, uphill, in the snow, barefooted, my first programming class used paper tape, my second programming class used punch cards, yadda yadda yadda.
Now, get off my lawn!
3 things I've learned. (Score:4, Insightful)
Sometimes it's easier and faster to code from scratch than it is to use off-the-shelf software, especially in the age of "frameworks".
In that train of thought, it's often better to toss and rewrite (or write new programs) than it is to extend existing programs.
It's easier to implement a whole new framework than it is to convince your boss that writing anew is actually faster.
Re:3 things I've learned. (Score:5, Informative)
I like this take on it [yosefk.com]: redundancy is bad, but the primary way of avoiding redundancy, factoring things out into libraries or frameworks, introduces dependencies, which are also bad. Which is worse? Depends, but I think dependencies are worse in more cases than people believe. I personally like to depend on something only when it's a big enough chunk of code to be worth adding a dependency for, has a clean API, and has an active and interested maintainer.
Making use of a database (Score:5, Insightful)
A database is not a bitbucket. Re-building basic database functionality in an external app is not a good idea. Applications, frameworks, languages come and go; data remains forever [1]. Business logic is part of the database. If you find yourself adding more and more "application servers" to get performance, then you have a fundamental problem with your architecture (and probably a fundamental misunderstanding of how databases work). While it is not impossible to learn and implement good data management/database development practices using Microsoft tools, such a result is seldom seen in the wild.
sPh
[1] Per Tom Kyte of Oracle, whose first database job at the Department of Agriculture involved working with datasets stretching back to 1790.
Re:Making use of a database (Score:5, Insightful)
> Business logic is part of the database.
I used to think the same thing, but that was before I ever had to solve a non-trivial problem.
What do I think now? Basically that business logic belongs in a domain layer, but that shouldn't mean the database is treated like a black box. You should make use of constraints within the database server to ensure a base level of data consistency. Default values (especially on auditing columns), correct datatypes, relationships, versioning and not-null constraints all belong within the database. Basically anything which can be supported without resorting to triggers or stored procedures.
Databases are excellent at sorting and filtering data. A massive amount of engineering talent has been invested in shaving milliseconds off of their algorithms. So let your database handle that. Don't filter in the application layer and then start complaining about performance. Need more capacity? Cluster. Throw more databases at the problem, not more application servers. Make sure they're running on properly specified physical hardware, built to best match the actual usage pattern of the database.
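A small sketch of the parent's point using SQLite from Python (the schema is invented): the base-level consistency rules live in the database itself, so no buggy application layer can sneak bad rows past them.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this opt-in
    conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY);
    CREATE TABLE orders (
        id         INTEGER PRIMARY KEY,                     -- integer key, never a string
        user_id    INTEGER NOT NULL REFERENCES users(id),   -- relationship enforced here
        quantity   INTEGER NOT NULL CHECK (quantity > 0),   -- consistency without triggers
        created_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP  -- auditing default
    );
    """)
    # This fails inside the database, no matter what the application does:
    #   conn.execute("INSERT INTO orders (user_id, quantity) VALUES (99, -5)")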
Re: (Score:3, Insightful)
Main reason is that you're hiding logic in the data layer. You don't want to do that. You should encapsulate all business logic in one place. Which, for systems of a certain complexity, is a domain model / layer.
Stored procedures are generally hard to test, and, unless you're using something like SQL Server's CLR integration the available functionality is limited. Not being object oriented, it's harder to manage complexity.
I've built some pretty complex logic in the past with SPs, and they're extremely
no MVC pattern... (Score:2)
unless tcl/tk won't do the job
2-port programs, Linux, PIL, expensive hardware (Score:5, Insightful)
If you are writing a program that touches more than two persistent data stores, it is too complicated.
I disagree. Is a program too complicated if it has 1. input, 2. output, and 3. logging? Is a program to prepare images for an online store too complicated if it reads 1. raw source images and 2. an overlay image and writes 3. finished images?
If Linux can do it, you shouldn't.
That'd be fine if we all ran Linux. But in an organization that already has to run Microsoft Access for other reasons, we have to take Windows into consideration. And I don't think Ted Dziuba was talking about just using Windows as a shell to run Linux in VirtualBox OSE either.
Don't do image processing work with PIL unless you have proven that command-line ImageMagick won't do the job.
Our programmer is far more experienced in Python than in bash, and if I felt like it, I could benchmark PIL against subprocess.Popen(['convert', ...]).
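If I did run that benchmark, it would look something like this sketch (file names are placeholders, and this is not a verdict on which approach wins):

    import subprocess, time
    from PIL import Image

    def with_pil(src, dst):
        im = Image.open(src)
        im.thumbnail((128, 128))
        im.save(dst)

    def with_convert(src, dst):
        subprocess.run(["convert", src, "-thumbnail", "128x128", dst], check=True)

    for fn in (with_pil, with_convert):
        t0 = time.perf_counter()
        fn("input.jpg", "thumb.jpg")
        print(fn.__name__, time.perf_counter() - t0)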
if the physical machine is not the bottleneck, do not split the work to multiple physical machines.
Yet PC game developers split a 4-player game across four PCs when one could do, and increasingly, PS3 and Xbox 360 game developers are following the same path.
It is far more efficient to buy your way out of a performance problem than it is to rewrite software. When running your app on commodity hardware, don't expect anything better than commodity performance.
If you are writing software to be used internally, sometimes springing for better hardware is worth it. But if you are writing software to distribute to the public, you can generally assume your customer has commodity hardware unless your software costs at least 1000 USD a seat.
I RTFA (Score:4, Informative)
I think that anything that reads "___ things to know about ____" or similar gets instant hits on
-1, Boring from me; hope it helps others.
I wish I'd known that CMS's are really hard (Score:3, Informative)
And that's before you get into the really difficult stuff (that very few have managed to master) of getting a website that is easy to navigate and intuitive to use.
What I want people to remember (Score:5, Informative)
Don't assume that, even six months from now, you're going to remember why you did things a certain way.
And the corollary: Don't assume you're going to be the one modifying the code a year or two from now.
Either way: Add comments liberally. Even if you're a conservative.
Thing I wish _others_ would know (Score:5, Insightful)
Do not make things super-modular and generic unless they 100% have to be. In 99.9% of the projects no one, including yourself, will use your stupid dependency injection, and logging / access control can be done just fine without AOP. Don't layer patterns where there's no need. Aim for the simplest possible design that will work. Don't overemphasize extensibility and flexibility, unless you KNOW you will need it, follow the YAGNI principle (you ain't gonna need it).
Re: (Score:3, Insightful)
(too lazy to log in)
hear hear.
i'm working in an over-architected codebase now and all the Design Patterns thrown in haven't done squat for making it flexible or modular,
they've just made it obtuse.
This is the problem with Linux (Score:4, Insightful)
"Modern Linux distributions are capable of a lot, and most hard problems are already solved for you. You just need to know where to look."
First off, I must say this piece says a lot about the Linux ecosystem, specifically that the system's documentation is anemic at best. Why don't we have something like:
"What do you want to do?" with an associated answer? This kind of arrangement surely cannot hurt the Linux ecosystem.
Re: (Score:3, Informative)
Overengineering can be a good thing (Score:5, Insightful)
You know, I find that as I get older, I am able to avoid overengineering things a lot better than when I was twenty-something. There's a nasty side effect, though: I'm learning a lot less in depth about systems than I normally would.
Overengineering is terrible for a project, but it often is highly educational.
One more tip (Score:3, Informative)
Visual Basic - Don't. Just don't.
It's always great to learn a language and then have the company change it so drastically in the next version that all your knowledge of the language is useless. I don't believe it'll be the last time that happens, either. I do know I will never bother to learn another MS programming language again.
Good luck to all you C# programmers when they switch to C#.NET, or whatever they call the next one. Hope you like reading!
Re: (Score:3, Insightful)
Right - but a real programmer realizes that languages always come and go, that the real skill is PROGRAMMING not C# or Java or C++ or Python or whatever. If you have a real understanding of the fundamentals then learning a new language is usually easy (and often kinda fun).
Re:One more tip (Score:5, Informative)
Visual Basic - Don't. Just don't.
It's always great to learn a language and then have the company change it so drastically in the next version that all your knowledge of the language is useless. I don't believe it'll be the last time that happens, either. I do know I will never bother to learn another MS programming language again.
Good luck to all you C# programmers when they switch to C#.NET, or whatever they call the next one. Hope you like reading!
C# is already .NET, there has never been any other version.
And what about C++, hmm? That's a language that went from "C with classes" to a multi-paradigm language with a sprinkling of template metaprogramming thrown in! The shift was so huge that in most compilers the standard libraries have two versions, a legacy version and a modern template version.
I had to re-learn C++ several times. Some of the new features are so advanced that not even all compilers support them. Things like partial template function specialization are only turning up now, 10 years after standardization. It's a language where I was shocked to learn that there were entire language features I wasn't even aware of after having used the language in production code for 20 years! That kind of thing has never happened to me before or since with any other language.
Re:One more tip (Score:5, Funny)
Java :)
Yes, and no (Score:3, Informative)
Some oversimplified philosophy, some good hints. Programmers and SysAdmins who do a lot of resource management eventually become managers. This isn't necessarily a bad thing, as the world needs more managers with extensive experience with that which they are managing, and the respect of the people they are managing. It's true that it's silly to adopt some software, technology or process just because it's new. But Ted seems to be resistant to any change, which is not good either. The problem with "don't fix it if it ain't broke" reasoning is: what do you do when it eventually breaks? This is a mistake made by many in process control / automation environments: failure of a part which is so obsolete that it has become difficult and expensive to obtain a replacement. Just try to find a new motherboard with an ISA bus these days. Or a composite monitor. The same thing can happen with software and the OS: where are you going to find a guy who knows enough about that old Kaypro which was running some COBOL software on CP/M, which controlled the electroplating machinery? This is why companies have lifecycle management, so that the pain of switching to newer software / hardware comes with predictable costs and timetables instead of sudden, possibly prolonged unavailability and expensive, awkward, band-aid fixes.
This flows into the idea of organizational amnesia, where important processes become lost. This is perhaps best illustrated by the US DoE forgetting how to make a secret substance called FOGBANK, a critical component of H-bombs. Upper management felt that, because there was no need for additional H-bombs, the process was unimportant, and didn't take into account that H-bombs become (more) dangerous with advancing age and would eventually need to be replaced. It took considerable time and money to re-engineer FOGBANK.
These are both examples of failure to consider that all equipment wears out, and failure to plan for long-term needs.
Some tips from a C guy. (Score:4, Interesting)
I know these might sound odd, but hear me out. Start by trying to rewrite the basic libraries: make your own printf, strcpy, strlen, etc. Write your own copies of linked list and tree storage methods, and above all really start to understand how memory works.
Another really important thing that I have learned is to stay FAR FAR away from OO programming until you're really comfortable in lower-level languages. The reason is that too many students and beginners sit there trying to figure out why their variable started with value X and ended up with value Y, only to find out that their object bashed some memory earlier on.
Basically, just grab a good C compiler, I mean a C COMPILER, not C++, not C#, not F#, and start to learn how all the functions you use on a daily basis work; it will give you new insight into why and how you can quickly avoid and fix problems when and before they happen. It's also important to get a really good handle on using a CLI over a GUI. Stay away from Visual Studio and other similar tools. Use GCC and CC, and make sure you look at how LD works and understand how compilers do optimizations and improvements to your own code. This post is talking about grabbing tools to do image processing and perform functions that have working solutions. However, taking the time to see how the solutions work and why they work will give you good insight into not only great code design but great programming methods. It might seem odd for me to suggest that a beginner try to rewrite strcpy or strcmp, but once you see how they really work you'll be far less likely to make the simple mistakes that can ground your program / project. It's the same way with a beginner figuring out how malloc works and where memory gets taken from and put to; all of these suggestions come from the way I learned to program in C and other languages.
Feel free to throw any of these away or take any of them into your own programming adventure, but one thing is for sure: when you can figure out how the basic functions you use every day work, it will save you hours and days of troubleshooting and leave you with a greater palette of tools to use in the classroom and on the job. I welcome anyone who wants to add ideas to this post or attack it with their own viewpoints.
Re:Some tips from a C guy. (Score:5, Insightful)
Comment your data too! (Score:5, Insightful)
I'm in a different boat from most commenters here, I think, because I am a scientist writing simulations; some simulations run a long time and create a lot of data which would be costly to reproduce, and what I wish someone had told me early on was that I should comment my *data files*, not just my code. Each file should include the exact parameters used to create it, an explanation of what each column represents, and preferably there should be a way of knowing what version of your simulation code was used to create it. A couple of times in grad school I had to toss out months of data after I discovered a bug in my code and didn't know when the bug showed up and which data was affected by it.
(I'd welcome other advice from simulationists too; I've never had an advisor who was particularly programming-savvy, even though programming was always a large part of my research, and so I always had to make it up as I went along.)
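For what it's worth, the habit costs almost nothing. A minimal Python sketch, with invented parameter names, assuming the simulation lives in a git checkout (otherwise record a hand-maintained version string):

    import json, subprocess, time

    params = {"dt": 0.001, "n_particles": 10000, "seed": 42}

    def code_version():
        out = subprocess.run(["git", "rev-parse", "HEAD"],
                             capture_output=True, text=True, check=True)
        return out.stdout.strip()

    with open("run_0042.dat", "w") as f:
        f.write("# created: %s\n" % time.strftime("%Y-%m-%d %H:%M:%S"))
        f.write("# code version: %s\n" % code_version())
        f.write("# parameters: %s\n" % json.dumps(params))
        f.write("# columns: time, energy, temperature\n")
        # ...simulation output rows follow...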
Re: (Score:3, Informative)
The boss wanted everything in XML, since that was extensible, but then went halfway because raw images don't encode well in XML. So we maintained the dataset in XML and binary.
But as a result, I was able to keep around all versions of the binary-to-XML converter in the current code base. With some unit tests, and some comments, it really helped.
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
Yes, and if you use units (and generally you do) then make it clear what units each parameter expects. I am in crypto and I am always guessing if things are in bits or bytes. In physics it is even more important - fortunately students of physics will probably be more inclined to describe the units they expect. Also, the names of the parameters are even more important than those in the source files. Basically, if you have e.g. a configuration file, it should be thought of as part of the user interface.
Re: (Score:3, Interesting)
Here's another tip, which I also thought about last time someone asked Slashdot about scientific data organization: keep a wiki, and write down all the things you do there (at least everything that isn't trivial to reproduce): commands, parameters used, input and output files created.
Emotional Things I Wish I Knew Earlier (Score:3, Insightful)
we could not figure out whether the author was an incredibly elaborate troll or just a run-of-the-mill idiot.
Reading this comment of his reminds me of something I read recently:
Physicists stand on each other's shoulders. Engineers dig each other's graves.
I've never understood why so many software developers feel the need to disparage one another in an attempt to prove their intelligence/superiority. There are plenty of tough problems out there and we all can learn something from one another, no? I've definitely been guilty of this in my tech career but lately I'm wondering more and more, why does the person who has a different solution always have to be an "idiot?" Why isn't he/she just someone who has a different take on solving this particular problem?
Now, I'm not saying that engineers do this more than any other group but out of all of my friends (some of whom are doctors, lawyers, teachers, etc.) it certainly seems like a more common event among software developers.
Re:Emotional Things I Wish I Knew Earlier (Score:5, Insightful)
I think there are several factors that contribute to this:
1. Programming is a very popular and easy to enter field.
2. It's actually pretty easy to get by as a programmer without really understanding what you are doing.
3. Regardless of how much you hear about it, modularity, reusability, and highly structured programming do not have good penetration in Software Engineering.
4. Because of #3, it is all too easy for otherwise competent programmers to paint themselves into a corner and generate software with really messy architecture and/or implementation.
5. Programmers OFTEN have to clean up after other programmers.
So, due to #1 and #2, there actually are quite a number of really bad programmers running around.
Due to #3 and #4, there are quite a number of otherwise decent programmers who produce working but unmaintainable code.
Due to #5, most programmers have ample opportunity to experience a great deal of pain from other programmers' incompetence.
Due to human nature, programmers tend to assume that all that bad code comes from #1 and #2 rather than #3 and #4.
And more speculatively and unrelated to the above:
6. Lots of programmers tend to hang out on Usenet, internet fora, mailing lists, and IRC, where harsh criticism is de rigueur, and internalize the habit of harsh criticism in their professional lives.
Version Control (Score:5, Insightful)
Put everything in version control. Everything. EVERYTHING!
Well, you could skip /home, but I know a rollback of /etc has saved me a couple of times on config upgrades.
Remember that once code is deleted, you normally can't get it back; version control changes that. It is one of the most vital tools for anyone developing or working with a computer.
Oh and git rocks and stuff :)
LISP/Scheme (Score:5, Informative)
I wish I'd known about LISP 25 years ago. Stupid people told me it was "for processing lists." If only I'd known better. Functional programming gives you wings and a jet engine.
I wish I hadn't paid too much attention to people with limited imaginations. Just because they're older, have more money and shout louder doesn't mean they are clever or wise.
C++ is way overrated, but it's worth knowing because it's so widely used. Don't let it distract you from mastering C and learning scripting languages. Understanding object-oriented design is more important than knowing the latest trendy language.
Objective-C.
Just because software is Free/Open doesn't mean it's "cheap" and poor quality. I could have saved myself 2-3 years there.
Ignore Windows and it will ignore you.
Meaningful messages. (Score:3, Insightful)
Nice.
The Truth (Score:5, Insightful)
A few rules of thumb for a startup environment:
1. Don't overengineer! Overengineering wastes time on things that may never be used. Features should be customer driven.
2. Functions and methods should be as small as possible. You should make it an obsession to split methods and functions into the smallest possible components. Only then can you have good code reuse. Don't start thinking "I'll split it when I need to"; you never will!
3. Never ever reinvent the wheel. Reinventing things that exist is overengineering.
4. Don't optimize ahead of time. When I say that I don't mean don't use a hash table instead of an array where it makes sense. I mean don't try to avoid exception handling or function calls or other minor optimizations. If it has an impact on readability, don't do it. Optimization always comes last. Often you'll find there are only 1 or 2 "hotspots" in your code. If you spend time optimizing these "hotspots" after your application is built, that's when you'll get the best return on your investment. Another gotcha with optimization is using technologies that can't deliver the level of performance you expect. You should test to make sure the underlying components you plan to use will perform as expected before you start coding.
5. Don't cram as much code in a single statement as possible. Every compiler I know about today will produce identical code whether it's one statement or 5 statements. It makes it hard to read so don't do it!
6. Allocate time for testing. No one writes perfect code. You want to give a good impression to your customer, so don't skip this step.
7. Make unit testing an obsession. Always add unit tests for new code; it reveals errors in your code. When you find a bug in your code, add a unit test for it (see the sketch after this list). If in the future someone decides to rewrite some function or method you wrote because it's not elegant enough, they will not reintroduce old bugs.
8. Don't rewrite code if possible. Refactoring is almost always easier and less error-prone.
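The sketch promised in point 7, using Python's stock unittest (the function and its bug are invented): once a bug is pinned down by a test, no "more elegant" rewrite can silently bring it back.

    import unittest

    def parse_port(s: str) -> int:
        # The bug fixed here once: an empty string used to raise ValueError.
        return int(s) if s.strip() else 0

    class RegressionTests(unittest.TestCase):
        def test_empty_string_regression(self):  # pins the old bug forever
            self.assertEqual(parse_port(""), 0)

        def test_normal_case(self):
            self.assertEqual(parse_port("8080"), 8080)

    if __name__ == "__main__":
        unittest.main()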
I disagree with nearly every point... (Score:5, Interesting)
If you are writing a program that touches more than two persistent data stores, it is too complicated.
Others have already mentioned cases where multiple datastores make sense. A trivial example: One database to handle user data, another to handle blobs (image conversions, etc) -- bonus if the second store can do its own conversions; a third to handle logging -- that's already three, and that's before we start considering things like RESTful services, which can function as intelligent datastores of their own...
If Linux can do it, you shouldn't.
Unless you're not on Linux. And, specifically:
Don't do image processing work with PIL unless you have proven that command-line ImageMagick won't do the job.
If you're doing something that truly works as a shell script, and isn't part of a larger app, I agree. However, PIL likely performs better, and it removes the shell as an issue -- if you thought SQL injection was bad, wait till you have people exploiting your shell commands. You can do it safely, but why would you bother, when you've got libraries that accept Python (or Perl, or Ruby) native arguments, rather than forcing you to deal with commandline arguments? Why do you want to check return values, when you can have these native libraries throw exceptions?
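To illustrate the injection point (the hostile filename is contrived): anything built into a shell string is one stray semicolon away from disaster, while list-style arguments never touch a shell.

    import subprocess

    filename = "photo.jpg; rm -rf ~"  # attacker-controlled input

    # Dangerous: the string is parsed by a shell, so the ';' is obeyed.
    #   subprocess.run("convert " + filename + " out.png", shell=True)

    # Safer: arguments are passed as a list and never shell-parsed, so
    # convert merely complains that no file has that odd name.
    subprocess.run(["convert", filename, "out.png"])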
Parallelize When You Have To, Not When You Want To
If you don't at least think about parallelization in the planning stage, it's going to be painful later on. It's easy to build a shared-nothing, stateless architecture and run it in a single-threaded way. It's hard to build a stateful web service with huge, heavyweight sessions, and then make it run on even two application servers in the future. Possible, but awkward, to say the least.
For example, if you are doing web crawling, and you have not saturated the pipe to the internet, then it is not worth your time to use more servers.
...unless, maybe, it's CPU-bound? And this is odd to mention in a section about parallelization -- wouldn't slow servers be a prime candidate for some sort of parallelization, even on a single machine, even if it's evented?
If you have a process running and you want it to be restarted automatically if it crashes, use Upstart.
Cool, but it looks like Upstart is becoming a Maslow's Hammer for this guy. Tools like Nagios, Monit, and God exist for a reason -- one such reason is knowing when and why your processes are dying even if they're spread across a cluster.
NoSQL is NotWorthIt
People who have read my other posts likely know where I stand on this, but...
Redis, even though it's an in-memory database, has a virtual memory feature, where you can cap the amount of RAM it uses and have it spill the data over to disk. So, I threw 75GB of data at it, giving it a healthy amount of physical memory to keep hot keys in...
So you found out an in-memory database wasn't suitable when you have far more data than physical memory? Great test, there.
Redis was an unknown quantity...
Maybe so, but that wasn't terribly hard to guess.
Yes, maybe things could have been different if I used Cassandra or MongoDB...
So maybe you should've benchmarked a NoSQL database which is actually designed to solve the problem you're trying to solve? Just a thought.
especially if something like PostgreSQL can do the same job.
If PostgreSQL could do the same job, the current generation of NoSQL databases wouldn't have been invented. Unless something's changed, PostgreSQL can't scale beyond a single machine for writes, unless you deliberately shard at the application layer, which would violate his rule about multiple datastores, wouldn't it?
It seems like the attitude is to no
Has ImageMagick improved? (Score:3, Insightful)
Don't do image processing work with PIL unless you have proven that command-line ImageMagick won't do the job.
I think the worst mistake I made as Mr. XEmacs was attempting to unify our graphics support to call ImageMagick libraries instead of the custom stuff we were using (and later restored when ImageMagick was backed out).
Does it work any better now? The last time I looked at display(1) a couple of years ago, it still wasn't close to the long-lost, patent-challenged xv(1) that got shut down by the GIF patent war.
until you find out why Python doesn't do the job (Score:4, Informative)
Don't program in C if Python will do the job.
But in a lot of cases, Python does not do the job. It doesn't do the job on iOS because Apple has explicitly banned everything but Objective-C++. It doesn't do the job on Xbox 360 or Windows Phone 7 because IronPython uses Reflection.Emit, and the version of .NET used by XNA doesn't support Reflection.Emit. And it doesn't do the job on Nintendo DS because the runtime uses up a lot of the available 4 MB of RAM.
Re:until you find out why Python doesn't do the jo (Score:5, Insightful)
Look, the PHB usually wants the code to run on HIS server for HIS use only. That's what he pays you for. Not to code it in the most cross-platform friendly language-du-jour and take 2 years to iron out all the bugs.
He doesn't give a shit if it'll run on the Xbox 360 or a Linux-ready Dead Badger.
And neither should you, unless you are some kind of anal retentive who spends all day arguing the merits of absolute versus relative positioning, fixed vs percentage tables, and worrying whether your code will run on every machine conceived in the next 50 years.
I have news for you, it won't.
Hell, I got an N900 6 months ago, and it's already EOL'd as far as updates to the OS are concerned.
I suppose I could wait till there's a port of Android or something, because the three coders on the project are doing a fine job, in less than a year I'll have the same functionality as a 3210.
Re: (Score:3, Interesting)
So, where exactly did you get that information from and why do you think it is real?
Re: (Score:3, Informative)
Hell, I got an N900 6 months ago, and it's already EOL'd as far as updates to the OS are concerned.
I suppose I could wait till there's a port of Android or something, because the three coders on the project are doing a fine job, in less than a year I'll have the same functionality as a 3210.
Or you can read a little farther and see that upgrades are already available: http://en.wikipedia.org/wiki/MeeGo [wikipedia.org]
Re:Don't code in C (Score:4, Insightful)
No. Sorry, I strongly disagree.
C is like a Swiss Army knife that can be used to assemble a chainsaw, a sword, a lightsaber, etc. You'll have to carve out all the pieces, file them down and put them together yourself, but the Swiss Army knife will help you along the way.
But don't forget the bandaids.
Re:xargs mapreduce? (Score:4, Informative)
I guess I'm an idiot, but.. err... did someone REALLY use mapreduce to solve an argument passing problem (the domain of xargs), or is the writing just shit?
xargs and its successor GNU parallel [wikipedia.org] implement the "map" part of MapReduce.
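A rough Python analogue of that "map" step (file names are placeholders), for when the fan-out needs to live inside a program rather than a shell pipeline:

    from multiprocessing import Pool

    def count_words(path):
        with open(path) as f:
            return sum(len(line.split()) for line in f)

    if __name__ == "__main__":
        files = ["a.txt", "b.txt", "c.txt"]        # what xargs would hand out
        with Pool() as pool:
            counts = pool.map(count_words, files)  # the "map", in parallel
        print(sum(counts))                         # a trivial "reduce"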
Re: (Score:3, Insightful)
Sadly, if you're working in Java then you will frequently be forced to write the same 8 damn lines over 70 times, and there will be literally no way to avoid it unless you resort to automatic code generation, which is a fancy name for using a separate program to do the copying-and-pasting for you.
Re: (Score:3, Informative)
The code in question (9 lines actually) is:
Ironically, making your code less object-oriented (i.e., screw encapsulation) fixes this problem.
Re: (Score:3, Insightful)
I'm not, no. But some people have, and they may be reading this right now. How do you know you're not addressing the next Bill Gates?
Anyway, your advice is way off. The people with the most _social_ success are raging egotists and shameless self-promoters. There are right ways and wrong ways to do it, and the wrong ways lead to you being written off as a pompous ass -- but not being such will never lead to social success.