Jeff's MCAD Blogging
Jeffrey Rowe has almost 40 years of experience in all aspects of industrial design, mechanical engineering, and manufacturing. On the publishing side, he has written well over 1,000 articles for CAD, CAM, CAE, and other technical publications, as well as consulting in many capacities in the design …
March 24th, 2016 by Jeff Rowe
Earlier this week many of us in the MCAD community were saddened to hear of the passing of Andrew (Andy) Grove, the former CEO and Chairman of Intel Corp. He was one of the most acclaimed and influential personalities of the computer and Internet eras, and was instrumental in the development and proliferation of CAD software as we know it today running on PCs.
Born András Gróf in Budapest, Hungary in 1936, Mr. Grove came to the United States in 1956. He studied chemical engineering at the City College of New York, completing his Ph.D. at the University of California at Berkeley in 1963. After graduation, he was hired by Gordon Moore (of Moore’s Law fame) at Fairchild Semiconductor as a researcher and rose to assistant head of R&D under Moore. When Robert Noyce and Moore left Fairchild to found Intel in 1968, Mr. Grove was their first hire.
He became Intel’s President in 1979 and CEO in 1987, and served as Chairman of the Board from 1997 to 2005. During his time at Intel and in retirement, Grove was a very influential figure in technology and business, and several business leaders, including Apple’s Steve Jobs, sought his advice.
Mr. Grove played a critical role in the decision to move Intel’s focus from memory chips to microprocessors and led the firm’s emergence as a recognized consumer brand. Under his leadership, Intel produced the chips, including the 386 and Pentium, that helped foster the PC era. The company also increased annual revenues from $1.9 billion to more than $26 billion.
Just as we could have ridden into the sunset, along came the Internet, and it tripled the significance of the PC.
March 17th, 2016 by Jeff Rowe
Like many of the ingredients in a manufacturing organization’s computer technology alphabet soup, such as ERP, SCM, CRM, not to mention CAD, CAM, and CAE, product lifecycle management (PLM) for years has been touted as being the final frontier for integrating all manufacturing IT functions. Honestly, though, can it truly provide all that the various vendors are promising? I have asked myself that question for several years now: Is PLM a great hope or just another great and continuing hype?
It seems that every vendor defines PLM in a manner that best suits its existing product lines and business practices, and not necessarily the processes of the customers it is trying to serve. Therein lies a big part of the PLM problem. PLM should address processes, not just products, especially the vendors’ own. Too many vendors still stress the products (and perpetual services) they are selling over the processes they claim to improve.
It also seems like everybody (yes, now including just about every CAD vendor big and small) has at least tried to get into the PLM act, regardless of whether their development and integration capabilities, or the needs of their customers, justify it. Even database giant Oracle has said for years that it wants to be a major PLM player, although the company has alluded that it doesn’t want to dirty its hands with traditional CAD/CAM stuff. Oracle wants to look at the bigger picture, although it has never elaborated on what that picture is.
March 10th, 2016 by Jeff Rowe
In a major move last week, Autodesk and Siemens announced an interoperability agreement aimed at helping manufacturers decrease the huge costs associated with incompatibility among product development software applications and avoid potential data integrity problems. Through this agreement, Autodesk and Siemens’ product lifecycle management (PLM) software business will take steps to improve the interoperability between their companies’ respective software offerings. The agreement brings together two CAD heavy hitters with the common goal of streamlining data sharing and reducing costs in organizations with multi-CAD environments (and these days, who doesn’t have a multi-CAD environment?).
The interoperability agreement aims to decrease the overall effort and costs commonly associated with supporting these environments. In particular, the companies are hoping that interoperability between the offerings from Siemens and Autodesk will significantly improve the many situations where a combination of each other’s software is used. Under the terms of the agreement, both companies will share toolkit technology and exchange end-user software applications to build and market interoperable products.
“Interoperability is a major challenge for customers across the manufacturing industry, and Autodesk has been working diligently to create an increasingly open environment throughout our technology platforms,” said Lisa Campbell, vice president of Manufacturing Strategy and Marketing at Autodesk. “We understand that our customers use a mix of products in their workflow and providing them with the flexibility they need to get their jobs done is our top priority.”
“Incompatibility among various CAD systems has been an ongoing issue that adversely affects manufacturers worldwide and can add to the cost of products from cars and airplanes to smart phones and golf clubs,” said Dr. Stefan Jockusch, Vice President, Strategy, Siemens PLM Software. “Siemens has been at the forefront in helping to resolve this incompatibility issue with a wide variety of open software offerings that significantly enhance interoperability. This partnership is another positive and important step in our drive to promote openness and interoperability and to help reduce costs for the global manufacturing industry by facilitating collaboration throughout their extended enterprises.”
March 3rd, 2016 by Jeff Rowe
For as long as I can remember, cloud storage and computing have offered one thing – endless promises of perpetual growth. For a while that was true, but some things have happened in the past couple of years that temper those claims and may portend the future for technology providers that become increasingly reliant on the cloud – layoffs.
Cloud computing, or Internet-based computing, provides shared processing resources and data to computers and other devices on demand. From the beginning it was intended as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort.
Proponents have always claimed that cloud computing allows companies to avoid upfront infrastructure costs, and focus on projects that differentiate their businesses instead of on infrastructure. Proponents have also claimed that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand. Cloud providers typically use a “pay as you go” model. This can lead to unexpectedly high charges if administrators do not adapt to the so-called cloud pricing model.
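To make the “pay as you go” point concrete, here is a minimal back-of-the-envelope sketch comparing cumulative on-demand charges against an upfront infrastructure purchase. All of the rates and figures below are made-up assumptions for illustration, not any provider’s actual prices.

```python
# Hypothetical illustration of pay-as-you-go pricing vs. a fixed
# infrastructure budget. Every number here is an assumption made
# up for the example, not a real provider's rate card.

def on_demand_cost(hourly_rate, hours_per_month, instances, months):
    """Cumulative pay-as-you-go bill over a number of months."""
    return hourly_rate * hours_per_month * instances * months

def fixed_cost(upfront, monthly_maintenance, months):
    """Upfront hardware purchase plus ongoing maintenance."""
    return upfront + monthly_maintenance * months

# Assumed figures: ten $0.50/hour instances left running around
# the clock (~730 hours/month), versus a $30,000 purchase with
# $1,000/month in maintenance.
cloud = on_demand_cost(hourly_rate=0.50, hours_per_month=730,
                       instances=10, months=12)
onprem = fixed_cost(upfront=30000, monthly_maintenance=1000, months=12)

print(f"12-month cloud bill:   ${cloud:,.2f}")
print(f"12-month on-prem cost: ${onprem:,.2f}")
```

Under these assumed numbers, the always-on cloud deployment ends up costing slightly more than the upfront purchase over a year – exactly the kind of surprise that hits administrators who never adapt their usage to the pricing model.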
To a large extent most of these claims have proven true, and I have been a proponent for many aspects of cloud computing, but there is also a downside – generally, you just don’t need as many people to run and maintain a cloud-based organization.
The downside is that you will have limited customization options. Cloud computing is cheaper because of economies of scale, and like any outsourced task, you tend to get what you get. A restaurant with a limited menu is cheaper than a personal chef who can cook anything you want. Fewer options at a much lower price is a feature, not a bug. On the other hand, the cloud provider might not meet your legal needs. As a business, you need to weigh the benefits against the risks.
February 25th, 2016 by Jeff Rowe
Last week, in Part 1, I ended the blog by saying that if you can’t fix something, you don’t own it. I still stand by that statement. This week I’ll continue the discussion for those of us who want some control over the devices we own and use, and not vice versa.
In the article, he says, “Imagine if Ford remotely disabled the engine on your new F-150 pickup because you chose to have the door locks fixed at a corner garage rather than a dealership. Sound absurd? Not if you’re Apple.”
February 18th, 2016 by Jeff Rowe
You don’t truly own something that you can’t get into to modify or repair.
I’ve got an iPhone 4S that’s a few years old and I still love it. I like the size, the feel, and I’ve purchased a number of accessories designed specifically for it. I’ve also rescued it after dropping it in water, and I know how to replace the battery, as well as the glass back and the front screen. These self-repairs are officially no-no’s according to Apple, and they aren’t easy, but because I know how to repair the phone I still really like and keep it 100% functional, I intend to hold onto it until something happens that I can’t resolve, such as a surface-mount component failure.
I’m probably not like a lot of consumers in that I don’t constantly need the latest and greatest. I’d rather maintain and repair what I have as long as I can. After all, I view my phone, cameras, and computers as tools that should be made to last – neither precious possessions on the one hand, nor mere throwaway items on the other.
My journey to fixing my own stuff started a number of years ago with an excellent resource called iFixit – a free online series of repair manuals for tinkering with thousands of products. The goal of iFixit is to teach virtually anyone how to fix the stuff they own – ranging from laptops to snowboards to toys to cell phones. In other words, iFixit is part of a global network of “fixers” trying to make the stuff they own last as long as possible.
Makers put things together; fixers take them apart and rebuild them. Tinkerers are a little bit of both, and are much more than just consumers — they are participants in the things we make, own, and fix.
This might sound great, but over the years, I have found that this participation — tinkering with products made by others — puts both makers and fixers at odds with manufacturers.
February 11th, 2016 by Jeff Rowe
It’s already mid-February, and with two months of the year nearly history, it’s not too late to tell our readers what we’ll be covering for the remainder of 2016. The MCADCafe editorial calendar below reflects what we perceive as some of the most important topics today, as well as feedback from our readers and other supporters on what they feel is important and relevant.
The main theme for each month will be covered in an extended article or series of articles so that the topic can be covered more comprehensively.
We’ll also be covering some of the major MCAD events throughout the year, reporting what we see and hear from vendors, partners, and attendees. All of the events we attend will include daily written coverage and Tweets throughout event days, as well as video and audio interviews.
If there is anything we missed or if you have any thoughts of topics or events you would like to see covered in 2016, feel free to contact me directly at email@example.com or 719.221.1867. I’m always open to suggestions and new ideas!
We look forward to an exciting 2016 and providing you with the MCAD content you want most for improving your design, engineering, and manufacturing processes and top and bottom lines.
Keep MCADCafe.com your source for all things MCAD because 2016 promises to be a great year!
February 4th, 2016 by Jeff Rowe
Being the editor of MCADCafe, I am constantly on the lookout for innovative software and hardware products that make working life better for designers and engineers. While some of these products are truly unique, many are retreads and “me too’s” of existing offerings.
Lately, I’ve been especially watchful on the hardware platform front, because it doesn’t seem as compelling as it once was, largely due to the rise of cloud-based hardware and software services.
However, something really caught my eye last year – the HP Sprout – a computing platform that is truly unique because it is a desktop computer that also has an integrated 3D scanner for 3D object capture and editing, as well as 3D print options.
In a nutshell, the Sprout is a relatively high-end Windows 8 computer with a novel two-screen configuration and advanced cameras, which together make some creative activities possible. The second display, projected onto a touch-sensitive desktop mat, is a major advance in the physical user interface for computers.
January 28th, 2016 by Jeff Rowe
This week Siemens announced that it was hitching a new car to its acquisition train: CD-adapco. At a purchase price of $970 million, CD-adapco is a global engineering simulation company with software that covers a wide range of engineering disciplines including fluid dynamics, solid mechanics, heat transfer, particle dynamics, reactant flow, electrochemistry, and acoustics. It is probably best known for its combustion engine simulation capabilities.
Established in 1980 and still controlled by its founders, the company has about 900 employees and approximately $200 million in annual revenue and an annual growth rate of 15 percent for the past five years, according to its website. Its main competitor in engine simulation software is Ansys.
January 21st, 2016 by Jeff Rowe
A lot has been debated and written about America’s general decline in science, engineering, and technology, largely blamed on where our youngest citizens take their first steps – education.
The new year gives me occasion to reflect on what it really means. Yes, it is the beginning of a new calendar year, but it is also the beginning of the second half of the school year for elementary, middle school, and high school students. The school year is especially important to me right now because I have just begun the second half of the current school year as a math teacher.
The second half of the school year provides me the opportunity to reflect on what I learned during the first half of the year and apply it to be a more effective educator during the second half.