Portfolio vs. Resume
A resume says nothing of a programmer’s ability. Every computer science major should build a portfolio. A portfolio could be as simple as a personal blog, with a post for each project or accomplishment. A better portfolio would include per-project pages and publicly browsable code (hosted, perhaps, on GitHub or Google Code). Contributions to open source should be linked and documented. A code portfolio lets employers judge ability directly; GPAs and resumes do not. You can check out my blog and GitHub here as well as my YouTube channels (randerson112358 & compsci112358).
Programming languages rise and fall with the solar cycle. A programmer’s career should not. While it is important to teach languages relevant to employers, it is equally important that students learn how to teach themselves new languages. The best way to learn how to learn programming languages is to learn multiple programming languages and programming paradigms. The difficulty of learning the nth language is half the difficulty of the (n-1)th. Yet, to truly understand programming languages, one must implement one. Ideally, every computer science major would take a compilers class. At a minimum, every computer science major should implement an interpreter.
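To make that last point concrete, here is a minimal sketch of what implementing an interpreter involves: tokenize the input, parse it with recursive descent into a tree, and evaluate the tree. This is an illustrative toy for arithmetic expressions, not a full language.

```python
# A toy interpreter: tokenize -> parse (recursive descent) -> evaluate.
# Expressions must be space-separated, e.g. "2 + 3 * (4 - 1)".

def tokenize(src):
    """Split the source string into a flat list of tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse_expr(tokens):
    """expr := term (('+'|'-') term)*  -- left-associative."""
    node = parse_term(tokens)
    while tokens and tokens[0] in "+-":
        op = tokens.pop(0)
        node = (op, node, parse_term(tokens))
    return node

def parse_term(tokens):
    """term := factor (('*'|'/') factor)*"""
    node = parse_factor(tokens)
    while tokens and tokens[0] in "*/":
        op = tokens.pop(0)
        node = (op, node, parse_factor(tokens))
    return node

def parse_factor(tokens):
    """factor := '(' expr ')' | integer"""
    tok = tokens.pop(0)
    if tok == "(":
        node = parse_expr(tokens)
        tokens.pop(0)  # discard the closing ')'
        return node
    return int(tok)

def evaluate(node):
    """A leaf is an int; an inner node is (op, left, right)."""
    if isinstance(node, int):
        return node
    op, left, right = node
    a, b = evaluate(left), evaluate(right)
    return {"+": a + b, "-": a - b, "*": a * b, "/": a // b}[op]

def interpret(src):
    return evaluate(parse_expr(tokenize(src)))

print(interpret("2 + 3 * (4 - 1)"))  # 11
```

Even a toy like this forces you to confront grammar, precedence, and recursion, which is exactly why implementing a language teaches more than merely using one.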
Students must have a solid grasp of formal logic and of proof. Proof by algebraic manipulation and by natural deduction engages the reasoning common to routine programming tasks. Proof by induction engages the reasoning used in the construction of recursive functions. Students must be fluent in formal mathematical notation, and in reasoning rigorously about the basic discrete structures: sets, tuples, sequences, functions and power sets.
If you want to read up on more induction problems, or discrete math topics in general, two great books for easily learning and practicing these topics are Practice Problems in Discrete Mathematics by Bojana Obrenić and Discrete Math Workbook: Interactive Exercises by James R. Bush.
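As a small illustration of the connection between induction and recursion mentioned above, here is a sketch: the base case of a recursive function plays the role of the induction basis, and the recursive case plays the role of the inductive step.

```python
# Claim (provable by induction): sum_upto(n) == n*(n+1)//2 for all n >= 0.
def sum_upto(n):
    # Base case <-> induction basis: P(0) holds, since the empty sum is 0.
    if n == 0:
        return 0
    # Recursive case <-> inductive step: assuming P(n-1),
    # sum_upto(n) = sum_upto(n-1) + n = (n-1)n/2 + n = n(n+1)/2.
    return sum_upto(n - 1) + n

# Spot-check the claim over a range of inputs.
assert all(sum_upto(n) == n * (n + 1) // 2 for n in range(100))
```

Writing the proof and writing the function exercise the same reasoning, which is why fluency in induction pays off directly in programming.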
Students should certainly see the common (or rare yet unreasonably effective) data structures and algorithms. But, more important than knowing a specific algorithm or data structure (which is usually easy enough to look up), students must understand how to design algorithms (e.g., greedy, dynamic strategies) and how to span the gap between an algorithm in the ideal and the nitty-gritty of its implementation.
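A quick sketch of why design strategy matters more than any single algorithm: for making change with coins {1, 3, 4}, a greedy strategy gives a worse answer than dynamic programming (the coin set is a standard illustrative example).

```python
def greedy_change(coins, amount):
    """Greedy strategy: repeatedly take the largest coin that fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

def dp_change(coins, amount):
    """Dynamic programming: best[a] = fewest coins summing to a."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        best[a] = min((best[a - c] + 1 for c in coins if c <= a), default=INF)
    return best[amount] if best[amount] != INF else None

# For amount 6, greedy picks 4+1+1 (3 coins); DP finds 3+3 (2 coins).
print(greedy_change([1, 3, 4], 6), dp_change([1, 3, 4], 6))
```

Both functions are easy to look up; knowing *when* a greedy strategy is safe and when you need dynamic programming is the skill that has to be learned.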
A grasp of theory is a prerequisite to research in graduate school. Theory is invaluable when it provides hard boundaries on a problem (or when it provides a means of circumventing what initially appear to be hard boundaries). Computational complexity can legitimately claim to be one of the few truly predictive theories in all of computer “science.” A computer science student must know where the boundaries of tractability and computability lie. To ignore these limits invites frustration in the best case, and failure in the worst.
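To feel those boundaries rather than just read about them, consider subset-sum, a standard NP-complete problem. A brute-force solver sketched below checks all 2^n subsets: fine for n = 20, hopeless for n = 200, and (as far as anyone knows) every exact algorithm pays an exponential worst-case price.

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Exhaustively check all 2^n subsets for one summing to target.
    Subset-sum is NP-complete, so this exponential blowup is not just
    laziness -- no known algorithm avoids it in the worst case."""
    return any(sum(c) == target
               for r in range(len(nums) + 1)
               for c in combinations(nums, r))

print(subset_sum_bruteforce([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)
```

Recognizing that a problem sitting in front of you is a disguised NP-complete problem is precisely the kind of prediction complexity theory makes possible.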
There is no substitute for a solid understanding of computer architecture. Everyone should understand a computer from the transistors up. The understanding of architecture should encompass the standard levels of abstraction: transistors, gates, adders, muxes, flip flops, ALUs, control units, caches and RAM. An understanding of the GPU model of high-performance computing will be important for the foreseeable future.
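Here is a sketch of the gate-level view of that abstraction stack: a 1-bit full adder built from AND, OR, and XOR, chained into a ripple-carry adder, mirroring how the hardware actually adds integers.

```python
def full_adder(a, b, cin):
    """One bit of addition from gates: sum and carry-out."""
    s = a ^ b ^ cin                   # sum bit: three-way XOR
    cout = (a & b) | (cin & (a ^ b))  # carry propagates when inputs agree
    return s, cout

def ripple_add(x, y, width=8):
    """Add two integers the way a ripple-carry ALU does: bit by bit."""
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(23, 42))  # 65
```

Seeing addition built from gates makes the higher levels, ALUs, caches, and the rest, feel like engineering rather than magic.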
Any sufficiently large program eventually becomes an operating system. As such, a person should be aware of how kernels handle system calls, paging, scheduling, context-switching, filesystems and internal resource management. A good understanding of operating systems is secondary only to an understanding of compilers and architecture for achieving performance. Understanding operating systems (which I would interpret liberally to include runtime systems) becomes especially important when programming an embedded system without one.
Given the ubiquity of networks, a person should have a firm understanding of the network stack and routing protocols within a network. The mechanics of building an efficient, reliable transmission protocol (like TCP) on top of an unreliable transmission protocol (like IP) should not be magic to a programmer; it should be core knowledge. People must understand the trade-offs involved in protocol design — for example, when to choose TCP and when to choose UDP. (Programmers also need to understand the larger social implications for congestion should they use UDP at large scales.)
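A loopback sketch of that TCP/UDP trade-off: UDP is a single unacknowledged datagram, while TCP gives an ordered, reliable byte stream at the cost of a handshake and connection state. (Ports are chosen by the OS here; over a real network the UDP datagram could simply vanish.)

```python
import socket

# UDP: fire-and-forget datagrams -- no handshake, no delivery guarantee.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))               # let the OS pick a free port
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send.sendto(b"ping", recv.getsockname())
data, _ = recv.recvfrom(1024)
print(data)                               # b'ping'

# TCP: connection-oriented stream -- the kernel handles ACKs,
# retransmission, and ordering for us.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())
conn, _ = server.accept()
client.sendall(b"hello")
msg = conn.recv(1024)
print(msg)                                # b'hello'
for s in (recv, send, conn, client, server):
    s.close()
```

Ten lines of socket code make the trade-off tangible: UDP needed two calls, TCP needed a bind/listen/connect/accept dance before a single byte moved.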
The sad truth of security is that the majority of security vulnerabilities come from sloppy programming. The sadder truth is that many schools do a poor job of training programmers to secure their code. Developers must be aware of the means by which a program can be compromised. They need to develop a sense of defensive programming — a mind for thinking about how their own code might be attacked. Security is the kind of training that is best distributed throughout the entire curriculum: each discipline should warn students of its native vulnerabilities.
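A sketch of exactly that kind of sloppy programming, using an in-memory SQLite table (the table and values are made up for illustration): building SQL by string concatenation versus passing user input as a parameter.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the attacker's quote escapes the string literal,
# and the injected OR clause matches every row in the table.
leaked = db.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(leaked)   # [('s3cret',)] -- the injection worked

# Defensive: a parameterized query treats the input as pure data.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)     # [] -- no user is actually named that
```

The fix is one character of API discipline; the habit of asking "how could this input be hostile?" is what a security-aware curriculum should instill.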
The principles in software engineering change about as fast as the programming languages do. A good, hands-on course in the practice of team software construction provides a working knowledge of the pitfalls inherent in the endeavor. It’s been recommended by several readers that students break up into teams of three, with the role of leader rotating through three different projects. Learning how to attack and maneuver through a large existing codebase is a skill most programmers will have to master, and it’s one best learned in school instead of on the job.
If for no other reason than its outsized impact on the early history of computing, students should study artificial intelligence. While the original dream of intelligent machines seems far off, artificial intelligence spurred a number of practical fields, such as machine learning (I really like machine learning), data mining and natural language processing.
Databases are too common and too useful to ignore. It’s useful to understand the fundamental data structures and algorithms that power a database engine, since programmers often enough reimplement a database system within a larger software system. Relational algebra and relational calculus stand out as exceptional success stories in sub-Turing models of computation. Unlike UML modeling, ER modeling seems to be a reasonable mechanism for visually encoding the design of, and constraints upon, a software artifact.
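To show relational algebra paying off concretely, here is a sketch mapping selection (σ), projection (π), and a natural join onto SQL, against an in-memory SQLite database with made-up tables:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE student (sid INTEGER, name TEXT);
    CREATE TABLE enrolled (sid INTEGER, course TEXT);
    INSERT INTO student VALUES (1, 'Ada'), (2, 'Alan');
    INSERT INTO enrolled VALUES (1, 'Compilers'), (2, 'Logic'), (1, 'Logic');
""")

# Relational algebra: pi_name( sigma_{course='Logic'}( student JOIN enrolled ) )
rows = db.execute("""
    SELECT DISTINCT s.name
    FROM student s JOIN enrolled e ON s.sid = e.sid
    WHERE e.course = 'Logic'
    ORDER BY s.name
""").fetchall()
print(rows)   # [('Ada',), ('Alan',)]
```

Because the algebra is sub-Turing, the engine can freely reorder joins and push selections down, optimizations that are impossible for arbitrary code, which is exactly why the model is such a success story.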
Thanks for reading this article! I hope it’s helpful to you all. Keep up the learning, and if you would like more computer science, programming and algorithm analysis videos, please visit and subscribe to my YouTube channels (randerson112358 & compsci112358).
Check Out the following for content / videos on Computer Science, Algorithm Analysis, Programming and Logic:
Video Tutorials on Recurrence Relation:
Video Tutorial on Algorithm Analysis: