Sunday, December 15, 2013

Scientific Computing

Scientific stuff!


Scientific computing is development that utilizes, creates, and maintains programs and suites focused on data and scientific processes. It spans fields such as bioinformatics, neuroscience, computational physics, numerical analysis, symbolic computation, computational chemistry, and computational biology. Like any development, requirements are defined, designed, and implemented through processes such as Agile, with constraints on the accuracy and precision of data where warranted. Sensibly, scientific programs may be as complex or as simple as they need to be: a simple web form that generates properly formatted formulae via LaTeX, a mathematical scripting language, or a full framework such as Mathematica (from Wolfram Research) or Octave (one of many open-source programs, this one handling matrix calculations and transformations).
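To make that concrete, here is a tiny sketch (my own toy example in plain Python, not taken from any of the tools above) of the kind of numerical work these programs do all day: approximating an integral with the trapezoidal rule.

```python
import math

def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f over [a, b] with the trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))  # endpoints get half weight
    for i in range(1, n):
        total += f(a + i * h)    # interior sample points
    return total * h

# The integral of sin(x) from 0 to pi is exactly 2.
approx = trapezoid(math.sin, 0.0, math.pi)
print(round(approx, 4))  # very close to 2.0
```

Real scientific code layers accuracy guarantees, error analysis, and performance on top of kernels like this, but the core idea is the same: turn a continuous process into something a computer can chew through.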


Having taken Biology, Oceanography, and a number of heavy Physics courses, I am interested in facilitating and modeling real-life processes in code. It is not my focus, but I enjoy a good puzzle, and algorithmically representing non-digital concepts definitely fits the bill, given the limitless potential complexity. Neuroscience seems to be all the rage today: modeling human perception, expanding artificial intelligence, and borrowing neuron-based storage and processing ideas for faster, more responsive computing. If we can model human consciousness, we can better understand ourselves. There is a lot of interest in creating a synthetic consciousness, though there are many ethical and philosophical concerns there as well.

Bonus! More Science applications! (CLICK ME!)


Tuesday, December 10, 2013

Computer Graphics: Shiny Stuff

Display thyself, knave!


Shiny water, texture maps, bloom, skyboxes, billboards, vectors, anti-aliasing, particle effects, cel shading, gradients, directional lighting, and many other terms describe aspects of modern-day graphics. With the ever-increasing speed of rendering, and the libraries and creation tools available, graphics become cleaner, crisper, and more plentiful everywhere from the OS to individual applications. This extends across more and more devices, from PCs and laptops to smartphones and smartwatches.


But graphics aren't just about games and interfaces; they are also a great way to visualize expensive or difficult real-world concepts such as quantum physics, or the tangible arts. Most anything in engineering can be modeled on the computer inexpensively before being built in real life. In fact, the graphic image above was used to work out the correct angles and form of the following sculpture.

Three-dimensional graphics can be rotated about axes using matrix-based mathematical transformations. Quaternions are a popular alternative representation: they encode an axis and angle of rotation, supply relative positional information for each affected point on a rigid body, and sidestep problems like gimbal lock. Splines and other skeleton-simulating methods can be used to add relationships between multiple rigid bodies, and while I'm fairly novice when it comes to these multi-part objects... they are interesting! Take a look at the orc and goblin skin in The Lord of the Rings, or the fur on the Monsters, Inc. monsters. The detail is complex, amazing, refractive/reflective, and flexible, and it maintains different behaviors based on the type of fur/fuzz/hair.
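As a rough illustration of the quaternion idea (a minimal pure-Python sketch of my own, not production graphics code), here is a rotation built from an axis and an angle, applied to a point via the classic q · v · q⁻¹ sandwich product:

```python
import math

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`.
    ax, ay, az = axis
    n = math.sqrt(ax * ax + ay * ay + az * az)
    s = math.sin(angle / 2.0) / n
    return (math.cos(angle / 2.0), ax * s, ay * s, az * s)

def quat_mul(a, b):
    # Hamilton product of two quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotate(v, axis, angle):
    # Rotate vector v: q * (0, v) * conjugate(q), for a unit quaternion q.
    q = quat_from_axis_angle(axis, angle)
    q_conj = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), q_conj)
    return (x, y, z)

# Rotating the x-axis 90 degrees about z should land on the y-axis.
r = rotate((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2)  # ≈ (0.0, 1.0, 0.0)
```

Graphics engines chain these (or convert them to matrices) per frame; the win over raw Euler angles is smooth interpolation and no gimbal lock.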

Bonus! More Snazz (Old, but Relevant) on Graphics. (CLICK ME!)


Communications and Security

Defend thyself, knave!


With the current climate of information theft, transparency wars, and the struggle between rights and monitoring, we really need to protect ourselves on the Internet.

This includes cloud networking, cellular data, and other wireless channels which can leave us vulnerable to predation and unethical policies out there in the web world.

Communications security is important, and addressing it intelligently is no simple task. There are guidelines to go by, however:

  • Separate your public and private self, putting only what you want to be freely accessed out there.
  • Question before opening or replying to emails: is this an address I recognize, and is it spelled completely correctly?
  • Use a firewall, whether software or hardware. Configure it on the strong side, and add permissions as needed to get things done. Erring on the side of safety is a good start.
  • Turn off location services for photos so they can't be traced to where you are, and minimize your own and your friends'/kids' traceability.
  • Keep your information -yours-. Don't let people use your accounts for their purposes. Instead, set up guest profiles or alternate accounts to lend out as needed, whether for chat, platforms such as Facebook, or other web-based services. And never give out passwords.
  • Email and chat are not very secure for sharing information such as passwords, identification, or authorizations. Be a bit paranoid when exchanging these; it is much better than gritting your teeth through reimbursement policies or even litigation over information theft.
  • Use trusted sites and services. Do your homework first if you have doubts. The Better Business Bureau and security sites can save a lot of grief in the long run.
  • Use secure settings for public browsing, and set network protocols to a more secure setting than you would at home. This includes cellphones and other mobile devices.
  • Facebook, G+, Reddit, and other social media networks can be fun, but they can also be used against you. This restates the points above about separating public from personal profiles, but the idea is imperative to maintaining the security of your information and reputation.
  • A lot of people save passwords in their browsers or through third-party add-ons. This is fine as long as you lock the machine and log off when you are not around.


The bottom line is that, just like driving, you have to be smart about it in order to protect yourself and the people who depend on you. Your information is worth a lot, even if it is just email addresses. These can be sold to telemarketers or stolen by hackers, who, if they find a way in, can get passwords for your services and sites, including online banking. Not good. Not good at all! So be safe, be smart, and do stuff so that people don't do stuff to you.

Bonus! More Snazz on COMSEC. (CLICK ME!)


Artificial Intelligence

Cyborgs, Androids and Skynet, oh my!


Artificial intelligence by definition is:

artificial intelligence noun

  • a branch of computer science dealing with the simulation of intelligent behavior in computers
  • the capability of a machine to imitate intelligent human behavior

More commonly, amongst those developing A.I., it is described as the study and design of intelligent agents: systems which perceive their environment and take actions to maximize their chances of success. Success depends on a goal or ideal the artificially intelligent 'being' is given as a target, providing context against its current state and progress. Depending on the scope of the A.I., this can be specific or relatively theoretical, using higher mathematical concepts to model uncertainties and erratic behaviors in tangible, mathematical ways. From there, computers can consistently analyze and benchmark progress, similar to human perceptions of progress toward life goals or task-based intentions.
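That perceive-then-act loop can be sketched in a few lines. This is my own contrived toy (the goal is just a number on a line, and `inc`/`dec`/`stay` are made-up actions), not a real A.I. system, but it shows the shape of an intelligent agent: observe state, score candidate actions against the goal, pick the one that maximizes progress.

```python
def simple_agent(goal, start, steps=100):
    """A toy intelligent agent on a number line: perceive the current state,
    evaluate each available action against the goal, take the best one."""
    state = start
    for _ in range(steps):
        if state == goal:
            break  # success criterion met
        # Perceive: the states each action would lead to.
        candidates = [state + 1, state - 1, state]   # inc, dec, stay
        # Act: choose the action minimizing distance to the goal.
        state = min(candidates, key=lambda s: abs(goal - s))
    return state

print(simple_agent(goal=7, start=0))  # prints 7
```

Real agents replace the number line with a rich environment, the distance check with a utility or reward function, and the greedy step with search, planning, or learning, but the feedback loop is the same.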


Artificial intelligence was founded on human intelligence... rather, on the capacity to model it in a manner machines may simulate. As such, there are many mixed feelings about the concept, as machines may stray from their purpose depending on their design. This pessimism is typical: people do not trust new technologies and tend to be fairly skeptical, or downright violently superstitious, toward powers they do not understand. On the other hand, as technology becomes more familiar, dependencies form and complacency/apathy grows. The field is interesting and versatile, and with the ever-increasing computing power packed into each unit of volume, we really have to take responsibility and create things that better life for everyone, not just progress for the sake of progression.

Bonus! More Snazz on A.I. (CLICK ME!)


Friday, November 15, 2013

A Brief History on Computer Science | PART TWO: Attack of the Phones... internet service providers.


Stopping for a breath of air feels nice, doesn’t it? Too bad, harden up and learn (review?) something…

FORTRAN comes into play in 1957, with IBM dominating the market, followed shortly by COBOL… and then, after a blatant run-on, in ’62 we see Stanford and Purdue establish some stuff. Departments… that ‘do stuff.’ What stuff? Computer Science!

The end. Okay, then there’s this thing called the Internet. That’s relevant, right? ARPANET came about in ’69, while Cray computers push the bar on processing speeds. Intel kicks in with small microprocessors; FTP is established; Engelbart demos his mouse at SRI while Alan Kay pushes graphical interfaces (icons and more) at Xerox PARC; Jobs and Wozniak build their Apple while Gates and Allen start Microsoft in ’77. This brings us to modern-day computing. 1977! No, not because Microsoft had a party; rather, because I was born and ready to surf the web… which… okay, never mind.

Sooo… in the eighties we saw PCs come to be, from Apple and IBM, as well as many computerized consoles and appliances, and later on laptops and smart devices. CD-ROMs became a great alternative to magnet-sensitive, small-capacity diskettes. By the mid-nineties, programming became increasingly accessible and user-friendly, with dial-up internet, file sharing, MUDs/MOOs, BBSes, and operating systems enabling accessibility and communication in an exponentially growing capacity.

I remember 800 baud modems. Logging in to GEnie, CompuServe, or *cough* AOL. Everyone had much more patience then… or dealt with it, as there was little alternative. Play-by-mail Dungeons and Dragons was still a thing. Well, it still is, but much less popular (the play-by-mail thing, not the game).

I suppose I’ll leave this as a history lesson and not so much a current-day lesson, as I long-wind like some old man to his family on a cold night… We have science and computers, and computer science. We have fourth-generation microprocessors with 32- and 64-bit operating systems running billions of operations per second in tandem. It’s often healthy to take a step back and get some perspective on just how far we have come, in order to understand that we can always innovate. There is always more to learn, and in order to do that we study, experiment, and communicate. We are passionate, sleep-deprived, perceptive, adaptive, and bright Computer Scientists.

I cannot in good conscience complete this post without mentioning some major things such as UNIX/Linux, C, ParcPlace (of Smalltalk fame), and Java. I have some links for digging into these, as they each deserve appropriately long posts of their own:

History of UNIX (CLICK ME!)

History of Linux (CLICK ME!)

History of C: K&R (CLICK ME!)

History of ParcPlace (CLICK ME!)

History of Java (CLICK ME!)


A Brief History on Computer Science | PART ONE: Calcular Wars


I could herald past efforts and speak of the innovation involved in computers from the abacus to modern-day smartphones and laptops, but I won’t. Instead I’ll focus on things I find important to remember in reference to Computer Science, and modern day development. This does involve a lot of Mathematics and technology, so ignore my comment and bring on the abacus!

A bunch of smart people refined logic – people like Gottfried Leibniz, who paved the way to modeling logic in binary form in the early 1700s. A bit over one hundred years later Charles Babbage described the first general-purpose computer, the ‘Analytical Engine’, and in 1854 George Boole created a structure of rules that turned simple true-false statements into computational processes. Mechanical calculators become a reality; Ada Lovelace develops the first published algorithm, for the Analytical Engine, and predicts symbolic computation will be possible in the future.

It blows my mind every time I think that less than two hundred years ago we were just beginning to figure out how to design calculators, let alone computers and mobile devices. I’m old enough to have seen the rise of Apple and Microsoft, the browser and file-sharing wars, and many of the developments impacting the languages and operating systems we use today.

Stuff happens, and ‘cybernetics’ and ‘bugs’ are coined by magazine submissions and moths in Navy circuits, respectively. Important stuff, hey! How else would we get our Neuromancer and Electric Sheep? Next up, we go back in time to the nineteen-forties! Expanding on Konrad Zuse’s mechanical computers, John von Neumann’s famous von Neumann architecture is developed: a stored-program design, using a small instruction set (around 21 instructions) to, “make it so.” These computers use memory, the ALU (arithmetic logic unit), and the control unit together to perform interpretation, assignment, and manipulation of data using registers. This is a huge step, and it led to computing as we know it. Memory got better, microprogramming (Wilkes) and compilers were built (Grace Murray Hopper), and transistors were built smaller and smaller.

A Timeline of Computers (CLICK ME!)


Monday, November 11, 2013

File Sharing: Data For All

Some ways to transfer and store stuff for... err... more great justice!


There are a lot of ways to store, stream, share, and retrieve data these days. The diskette and Zip drive days are gone, as are the tape drives and punch cards before them. Moore's Law keeps us busy making smaller and more efficient storage for holding data. Fortunately, the data is also being handled better and faster. We've got huge RAM, thumb drives, massive cloud storage like Google Drive and DropBox, and fast solid state/SATA/HDD local storage! It's amazing!


More stuff to look into for the neophyte scientist or casual browser... We've got old-school web hosting via FTP (File Transfer Protocol); REST APIs sending formatted data in response to queries such as AJAX GET calls, which asynchronously toss the data into tables or create site elements, or even whole pages, from it. Drupal, SQL- and MySQL-based stacks, ODBC/JDBC, and other database frameworks similarly allow for back-end file-sharing beasts which efficiently store and format data, provide statistical analysis, or secure secret digits for speedy handshakes. There are also Java Servlets, JSON-stringified objects, BLOBs, and a great many other entities that make our lives easier behind our comfortable, intuitive UIs. Everyone shares files, so it's good to take a step back and look at the system to understand just how easy we have it!
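For instance, a "JSON-stringified object" is just structured data serialized to text so it can travel between a back end and an AJAX call. A trivial sketch using Python's standard json module (the record here is a made-up example, not any real API's payload):

```python
import json

# Structured data on the server side...
record = {"id": 42, "name": "sample", "tags": ["demo", "file-sharing"]}

payload = json.dumps(record)    # serialize: dict -> JSON string, ready for the wire
restored = json.loads(payload)  # deserialize on the other end: string -> dict

print(restored["name"])  # prints sample
```

Every REST response, AJAX call, and BLOB-ish config dump in the paragraph above is some variation on this round trip; the formats and transports change, but serialize-ship-deserialize is the whole game.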

Bonus! More Snazz on File Sharing. (CLICK ME!)