Thursday, November 22, 2012

#Vault tips: recap on Vault-managed revision tables (#FlowGeneration)


11/21/2012

The Last Word on Tables (WikiHelp and Revision Tables)

Over the last week or so we have covered some of the more advanced approaches to managing your revision tables in Vault, including style and manual updates.  For a more complete overview of the revision table concepts and a review of these advanced settings, check the general usage page here and the advanced table configuration here.

- Allan


Vault WG/Coll/Pro users can brush up on their revision table management with these links to the complete overview of how to configure and lay out these tables.

Tuesday, November 20, 2012

FW: Tip Tuesday! Housekeeping in #AutodeskPLM360


Big holiday week for most of us on this side of the pond – I know I have a lot of housekeeping to do before the big feast on Thursday and thought I’d share a tip for keeping your workflow tidy in PLM 360.

It’s really a simple thing, but if you’re a PLM admin, do your end-users a huge favor and put some notes in your workflows.  This is especially key when you have precondition scripts set.

[Screenshot: workflow map with notes]

It might take a little extra time to drag states and transitions around to make some real estate for the notes, but it’s worth it.  Try to position your notes off to the sides and keep them simple, but informative. 

I guarantee this will prevent a few ‘how come I can’t transition this workflow’ questions when the user can read the note and realize they can’t because they aren’t the record owner (if you have a ‘GenIsOwner’ script set as a precondition), or ‘why can’t I submit this audit’ questions when they aren’t listed as an auditor in the Audit workspace (where a precondition script only allows this transition for the lead auditor or someone noted on the auditor list).

We’ll be talking more about workflow in tomorrow’s PLM Talk or you can visit the Wiki for our Workflow Guide.  ^MS

Photo: chatiryworld

Don't make users ask why! Add notes to workflows in #AutodeskPLM360 to clarify workflow transition states & conditions. Looking forward to today's PLM Talk on workflows.

Reading: 5 NOs to make PLM usable (#FlowGeneration)

User experience. You hear this combination of words quite often these days, and the PLM space is no exception. People in enterprise software, engineering and manufacturing are starting to ask more questions about usability. There are multiple reasons for that. Think about end users – engineers, project managers, etc. Consumer technologies have a significant influence on their perception of how future software needs to look and feel. In their home lives they are exposed to so many technologies, many of which are more powerful and more usable than the products their company IT provides. BYOD is only one example of consumer technology’s impact. Speaking about companies (as opposed to individuals), I can also see significant interest in usability and user experience. At the recent PLM Innovation conference in Atlanta, many companies spoke about the importance of usability in the future of PLM products.

Delivering usability is a complex task. However, I decided to make an attempt to define the “five NOs” you need to think about to make PLM software usable. These NOs can probably be applied to other enterprise software too, but I think they make the most sense for the PDM/PLM industry.

1. No memorizing things. We are overloaded with information. In the past, our best user experience was the “file explorer” or “project browser”. The hierarchical view was the dominant UI pattern; it was everywhere, and PDM/PLM software actively mimicked it. Then Google revolutionized this behavior. We don’t need to “browse for things” anymore; we can “search for things”. This removes the need to memorize everything and makes the behavior much simpler.

2. No user interface inconsistencies. It takes a long time to develop PDM/PLM products, and many companies also spend months or years implementing them and putting them into production. As a result, we see many inconsistencies across products and modules developed in different periods. By removing these inconsistencies, we can make the experience much easier and more pleasant.

3. No strange terminology. Enterprise software is well known for TLAs; they are everywhere. In addition, enterprise organizations are well known for creating lots of abbreviations and assumptions about what to call things – ECO, MBOM, EBOM, QBOM, SCM, CCB… and this is a very short and incomplete list of the terminology used by software and companies. While software vendors cannot change the way a customer works, they clearly can make things easier and simpler on their side. So, an attempt to eliminate abbreviations and inconsistent terminology can improve the experience as well.

4. No gaps in user activity flow. User activity is important. The customer (especially an individual worker) is very sensitive to the ability to get a job done. From that standpoint, what is needed is software that supports a smooth process flow. Don’t expect the customer to be pleased by the need to jump across screens, perform strange file manipulations (e.g., copy, save, open) or understand how to “make a sync” in order to transfer data between multiple systems. The system needs to assume that the end user will forget, misunderstand and ask many questions if things do not go as expected.

5. No duplication of office and other software and tools. PLM has a love-hate relationship with office and email systems. The complementary nature of these systems is obvious: PDM/PLM needs to rely on office and email systems, which are widespread in organizations and have huge mainstream adoption. So, integration with these tools is a no-brainer for PDM/PLM functionality. At the same time, those very tools (email, Excel and content management systems) have replaced PLM tools for collaboration and other forms of communication in the organization. I believe people are very comfortable with email and office systems, so not replacing them can make the user experience much better.

What is my conclusion? People are paying attention to user experience. Bad UI is not a joke anymore. When Boeing is paying attention to usability and SAP is investing in the gamification of its software, PDM/PLM vendors need to think twice about their priorities. Just my opinion. YMMV.

Best, Oleg

image credit http://www.semantico.com/


UX combined with uninterrupted workflow. #OlegShilovitsky points out five no-gos for usability in PLM/PDM solutions. Thinking hard about a solution should be reserved for pure flow moments; the work that has to be done to work out that solution should be as smooth as possible. UX is being spearheaded by the mobile device manufacturers, and employees demand equal UX/usability from their company software – hence the high number of people who BYOD.

Monday, November 19, 2012

#Inventor #iLogic tip: Automatically update drawing properties from model iProperties

You have copied some model iProperties into the drawing file of the model, following, for instance, the steps described in this article.


Of course you want to keep the model and drawing properties in sync, but you want to do that automatically, without having to remember to hit the “Update Copied Properties” button in the drawing.


In order to do that, you need to follow the steps below in the drawing.

  • Menu Manage > iLogic > Add Rule.

  • Create a rule using the script you can download here.

  • Menu Manage > iLogic > Event Triggers.

  • Set the rule to run on the After Open Document and/or Before Save Document events, or on other events you may find more appropriate.

If you want, you can apply the steps above in a new drawing and save it as a template.
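For orientation, here is a minimal sketch of what such a rule can look like (this is not the downloadable script; the property names and the single-model assumption are illustrative only):

Dim drawingDoc As DrawingDocument = TryCast(ThisDoc.Document, DrawingDocument)
If drawingDoc Is Nothing OrElse drawingDoc.ReferencedDocuments.Count = 0 Then Return ' nothing to sync

' Take the first referenced model; the property names below are examples.
Dim modelDoc As Document = drawingDoc.ReferencedDocuments.Item(1)
For Each propName As String In New String() {"Part Number", "Description"}
    ' "Design Tracking Properties" is the standard Inventor property set holding these.
    Dim modelProp As Inventor.Property = modelDoc.PropertySets.Item("Design Tracking Properties").Item(propName)
    drawingDoc.PropertySets.Item("Design Tracking Properties").Item(propName).Value = modelProp.Value
Next
InventorVb.DocumentUpdate() ' refresh so the title block shows the new values

Wired to the After Open Document or Before Save Document trigger, a rule along these lines keeps the copied properties current without the manual “Update Copied Properties” click.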


Ale

Nice post from the Being Inventive blog with a great auto-sync for part/assembly iProperties that are used in the drawing file.

#AutodeskPLM360 FW: Feature Friday: Don't Go Cross-Eyed Figuring Out Notifications in PLM 360


One benefit to any PLM system is automatically notifying folks when they have work to do.  Today I want to go over a few key settings an administrator can configure when creating workflows.  While this is complex, once you get the hang of it, your eyes will uncross – promise!  On each workflow transition, you can set different ways to notify users, either via e-mail or as a to-do in the Outstanding Work on your dashboard.

[Screenshot: workflow transition notification settings]

Let’s walk through what each one means.  We’ll start in the middle since that’s easiest to grasp IMHO.  When checked, you’ll see a link to the record in your Outstanding Work when you have the ability to perform the transition (in this case, when the record is in state ‘Awaiting Team Review’).

[Screenshot: Outstanding Work on the dashboard]

The first one in the highlighted section, ‘Notify by e-mail on occurrence’, is used along with the third setting, ‘Notify users who have permission to perform’.  It’s a forward-looking notification for those users who are capable of performing the following transitions.  I put some notes into the workflow to try to show this a bit better – we’ll focus on just part of the workflow map.

[Screenshot: workflow map with notes on the notification settings]

The notes point out the relevant settings.  Let’s look at how we configured the two transitions coming from the state we’re in – the important bit for our example is that we set ‘Notify users who have permission’ to true, so that when that transition becomes available we look back to see whether ‘Notify by e-mail on occurrence’ was set to true in the preceding transition:

[Screenshot: ‘Implement’ transition settings]

[Screenshot: ‘Clarification Needed’ transition settings]

Here’s the e-mail a user (in this case, Tabby from last week’s post) gets when the transition ‘Start Phase 1’ occurs and they have permissions for both transitions coming from state “[02] Phase 1 Review”:

[Screenshot: the notification e-mail]

To help wrap my head around this, I like to first flesh out the workflow – don’t worry too much about the settings.  Then walk through state by state and think about when you want the notifications to occur.  If I were sitting at the state “Phase 1 Review” and knew I’d want folks notified when the item reaches that state, I’d go and make sure the transition leading into that state has ‘Notify by e-mail on occurrence’ set and at least one of the transitions coming out of the state has ‘Notify users who have permission to perform’ set (keep in mind you may not want notifications going out to all users; here’s where having different workflow permissions is useful).  Perhaps I’ll go into more detail on that sort of configuration in another blog, as well as limiting who gets e-mail via scripting (have to leave you guys wanting to come back for more!).

Still clear as mud on how this whole notification thing works?  Let’s do one more quick example.  Here we’ll explore what happens if we don’t have ‘Notify by e-mail’ set.  I tweaked the settings on the ‘Clarification Needed’ and ‘Start Phase 1’ transitions to include it, but notice our ‘Initiate’ transition doesn’t have it set.

[Screenshot: ‘Clarification Needed’ transition settings]

[Screenshot: ‘Start Phase 1’ transition settings]

[Screenshot: ‘Initiate’ transition settings]

So when a record is initially created, ‘Initiate’ automatically happens, but no e-mail goes out since ‘Notify by e-mail’ is not set. This is useful in workflows where the user creating the item will most likely be the one kicking off the workflow.  Now let’s say the workflow moves along, and once we’ve gotten to “[02] Phase 1 Review”, the step ‘Clarification Needed’ is performed, sending the record back to “[01] Awaiting Team Review”.  Now the transition ‘Start Phase 1’ is available again, and this time the notification *is* sent, since ‘Clarification Needed’ is configured to send the notification e-mail and ‘Start Phase 1’ has ‘Notify users who have permission’ set.
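Reduced to a toy model (this is illustrative pseudo-logic, not PLM 360 code or its API), a user receives the e-mail when three things line up: the transition that just fired has ‘Notify by e-mail on occurrence’ set, a transition leaving the new state has ‘Notify users who have permission to perform’ set, and the user can perform that next transition:

' Toy model only: does user U get an e-mail when transition T fires?
Function ShouldEmail(notifyOnOccurrenceSetOnT As Boolean,
                     notifyPermittedSetOnNext As Boolean,
                     userCanPerformNext As Boolean) As Boolean
    Return notifyOnOccurrenceSetOnT AndAlso notifyPermittedSetOnNext AndAlso userCanPerformNext
End Function

In the ‘Initiate’ example above, the first argument is False, so no e-mail goes out even though users could perform the next step.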

We’ll be covering Workflow Process in next week’s PLM Talk if you’re interested in learning more (follow the link to register) and you can read more about workflows in the Workflow Guide on the Wiki.

Photo needs no crediting this week as it’s of my kitty friend Augustus Steve McQueen of Austin, TX (but I get to call him Augie) ^MS

Recently had a chat with a good friend about the power of workflow notifications for project managers at gates and project-critical checkpoints. Nice post laying out the basics.

Friday, November 16, 2012

Anticipating: the 4-hour chef by #TimFerriss

If you have missed Tim's media attention campaign for his upcoming book, here's a recap: http://fourhourchef.com/.

If you are interested in why major publishing firms are boycotting the launch and what can be done (and what Tim is doing) to circumvent it, please go here: http://www.fourhourworkweek.com/blog/2012/11/15/the-4-hour-chef-all-you-can-eat-campaign-of-goodness/

Personally, I ordered a few extra copies as presents.

Can't wait to see/read the next step in the Tim Ferriss cycle and start implementing some overdue lifehacks/meta learning techniques.

Notable mention for #hopshopgo for customers outside the U.S. Lovely service.


Thursday, November 15, 2012

#Autodesk #Vault 2013 feature use case: file compression during transfer (#FlowGeneration)


11/14/2012

Squeezing Your Files (Compression on transfer)


One of the new features available in the Vault 2013 products was file compression during transfer.

Compression is controlled by a server-side configuration file setting and is enabled by default for server-to-server file replication, which permits faster, more efficient data transfer.  This feature can, however, also be leveraged for file transfer between server and client – that is, compressing the files as you get them from (check out) or send them to (check in) the vault.

Now, before you run off and start compressing your downloads, you should be aware that this is really only designed for users with a poor server connection, WAN or perhaps remote access.  Having your files compressed, transferred and then uncompressed does not really speed anything up if you are sitting right next to the server with a gigabit connection.

If you do however fall into the category of having a poor server connection and you want to learn more, head off to our WikiHelp article here and test for yourself.

- Allan

Photo: Elsie esq.


Klaas De Smedt

Dear Allan,

Since Wednesday I moved to another office and now I work with Vault over a WAN connection. I wondered if this works for Vault Basic?

Kind regards,
Klaas

Posted by: Klaas De Smedt | 11/15/2012 at 07:58

Brian Schanen

Hi Klaas,

This will work with Vault Basic. As this is a server-side modification, we need to be careful that it does not adversely affect other users connected locally to the server, but it certainly may be worthwhile testing in your situation to see if there are significant performance gains.

Cheers,
Allan

Posted by: Brian Schanen | 11/15/2012 at 12:12


Handy feature in Vault 2013 for field engineers with a bad/slow network connection to "squeeze" some more speed out of the vault interaction.

Tuesday, November 13, 2012

Reading: Avoiding the false proxy trap (#FlowGeneration)

Sometimes, we can't measure what we need, so we invent a proxy, something that's much easier to measure and stands in as an approximation.

TV advertisers, for example, could never tell which viewers would be impacted by an ad, so instead, they measured how many people saw it. Or a model might not be able to measure beauty, but a bathroom scale was a handy stand in.

A business person might choose cash in the bank as a measure of his success at his craft, and a book publisher, unable to easily figure out if the right people are engaging with a book, might rely instead on a rank on a single bestseller list. One last example: the non-profit that uses money raised as a proxy for difference made.

You've already guessed the problem. Once you find the simple proxy and decide to make it go up, there are lots of available tactics that have nothing at all to do with improving the very thing you set out to achieve in the first place. When we fall in love with a proxy, we spend our time improving the proxy instead of focusing on our original (more important) goal.

Gaming the system is never the goal. The goal is the goal.

Are we fooling ourselves? A nice wakeup call from #SethGodin.

#Autodesk #Vault cleanup: Find Orphaned Files

While I’m busy getting my AU classes in order, Jan Liska kindly provided this month’s sample app.  He is also the author of the popular Drawing Compare app.  Thanks Jan!

Find Orphaned Files is a quick and easy way to detect files that are not referenced by any other file.  In other words, files with no files listed in the “Where Used” tab.  To launch the app, just select a folder in Vault, right-click and choose the Find Orphaned Files command.
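Conceptually (a toy sketch only, not the app's actual Vault API code), the check is just a filter for files whose "Where Used" list is empty:

' Toy illustration of the concept: a file is "orphaned" when nothing references it.
Imports System.Collections.Generic
Imports System.Linq

Module OrphanDemo
    Function FindOrphans(whereUsed As Dictionary(Of String, List(Of String))) As List(Of String)
        ' Key = file name, value = names of the files that reference it ("Where Used").
        Return whereUsed.Where(Function(kv) kv.Value.Count = 0).Select(Function(kv) kv.Key).ToList()
    End Function

    Sub Main()
        Dim data As New Dictionary(Of String, List(Of String)) From {
            {"bracket.ipt", New List(Of String) From {"frame.iam"}},
            {"old_rev.ipt", New List(Of String)()}} ' referenced by nothing -> orphan
        For Each name As String In FindOrphans(data)
            Console.WriteLine(name) ' prints old_rev.ipt
        Next
    End Sub
End Module

The real app, of course, asks the Vault server for the reference information instead of using a hard-coded dictionary.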

Requirements:
Vault Workgroup/Collaboration/Professional 2013

Click here to download the application
Click here to download the source code

As with all the samples on this site, the legal disclaimer applies.

Nice feature in the WG/Coll/Pro versions of Vault for tidying up unreferenced files.

Wednesday, November 7, 2012

Reading: A More Effective Board of Directors

The definition of board effectiveness has shifted dramatically over the past decade. In the aftermath of the global financial crisis and numerous corporate scandals, a director now confronts not only complex oversight accountability, but also personal risk and liability. Clearly, this is a job not for the faint of heart.

As the supply of courageous board candidates dwindles, global companies are in need of battle-tested directors more than ever — board members who fully understand and can actively engage in virtually all aspects of an enterprise's operations. To be truly effective, a board needs directors who can work as a group to clearly define their role and mission, and in specialized individual roles, such as succession planning, acquisitions and capital allocation.

In this context, it's become rather easy today to identify the weakest boards. Typically, such boards comprise directors who act distant and detached — traits anathema to a business environment that demands transparency and accountability.

My colleagues and I recently studied what makes some boards more effective than others. We found that boards tend to progress from good-to-great along a four-phase continuum: 1) foundational, 2) developed, 3) advanced, and 4) strategic. Essential to creating a high-performance board is agreement and alignment, at the outset, on where the board actually stands in this continuum and where it needs to be.

The continuum essentially represents a corporate hierarchy of needs, akin to the famous personal-development hierarchy created by psychologist Abraham Maslow. In the corporate model, you equate a "foundational board," which provides basic compliance oversight, to basic survival needs such as food and shelter in the human hierarchy. Similarly, a "strategic board," which provides prescient forward-looking insights to form a company's foundational strategy, is fully actualized and high-performing.

Foundational — survival — boards focus on compliance; they play it safe. These are the weak performers in the corporate food chain, with directors who are unwilling to take strong positions, make tough decisions, or play proactive operational roles. Strategic — actualized, in Maslow's terms — boards underpin high-performance companies, where directors take appropriate risk to make significant contributions and lasting impact on enterprise value.

So how can weak boards advance along the effectiveness continuum if they find themselves clinging to survival basics? In our study, we found five elements — "disrupters" — that tend to hinder the progression of boards toward self-actualization and high performance:

  • Lack of clarity on the roles of individual directors and the board as a whole. Role ambiguity slows decision-making and causes unnecessary director conflicts.
  • Poor process management hinders effective board preparation, meeting management, and communications. This results in indecisiveness and a lack of urgency on critical challenges facing the organization.
  • Lack of alignment and agreement on company strategy causes disinterest among board members, who then simply default to tackling regulatory and compliance issues. Poor strategic alignment also hampers a board's ability to prioritize issues and set their near-term agendas. This often causes board disruption and sends damaging signals to financial markets.
  • Poor team dynamics fracture boards and lead to power struggles. Like any effective working group, a board should be composed of professional peers who respect and work well with each other.
  • Board composition is a serious impediment, if not done right. Today's challenges require new perspectives and skills. But boards often lack the ability to objectively evaluate their makeup to determine if they have the right people and skills at the table.

I've seen my fair share of effective boards and dysfunctional ones. The worst cases nearly always exhibit at least one of the disrupters described above.

Classic dysfunctional examples include organizations where the company founder dominates board discussions and stifles all attempts to change and modernize the company or alter the composition of the board (i.e., poor team dynamics). In other cases, highly compensated boards literally run a company into the ground by churning through CEO after CEO (lack of strategic alignment). Other weak-performing boards focus on recruiting "big-name" directors — typically high-profile CEOs — who are simply too distracted by operational and financial issues facing their own companies to make any significant contribution (poor board composition).

In stark contrast, I've worked with board chairs who had the foresight and courage to spin off a successful division to help that now-standalone unit focus its resources on building its brand and market presence. In these instances, short-term personal gains were cast aside in favor of the long-term viability and health of the division — and the corporate entity.

The board of an international restaurant chain, for instance, played a leading role in reducing the company's overall risk profile. Specifically, a director personally spearheaded the development and adoption of an advanced enterprise resource planning system, working hand-in-hand with internal staff. Another high-performing board immersed itself into a global financial services firm's complex financing activities, as it successfully navigated a financial crisis. These directors went well beyond basic compliance to provide true strategic counsel.

To add such strategic value, high-performing boards must be "talent-centric." At its most basic level, this manifests itself in a board's composition and diversity level. An enterprise must attract directors who can provide valuable, strategic input, while building a board that can draw on the diversity of its members' expertise and backgrounds — across geographies, gender, race, and experience — to create a whole that's literally greater than the sum of its parts.

Strategic directors also commit to performing at their full potential and have the courage and self-confidence to raise and address any personal developmental needs. They also must be able to give constructive feedback to other directors to enhance the personal effectiveness of their board colleagues. A number of talent-development tools are available to help, including individual director and board assessments that gauge learning agility (the ability to learn from past experience and manage amid uncertainty) and other valuable traits and skills.

Effective corporate governance is more complex and challenging than ever. Companies need boards to help them meet regulatory compliance basics. But the most effective boards are those that easily check that box, while also delivering solid strategic counsel and direction. Recruiting and developing directors who go well beyond basic needs is the secret to building a high-performing, fully actualized board.

Gripping piece on #HBR describing what is wrong with many boards of directors today and how to change it. Interesting points I found were talent management to select the best team, poor process management that creates problems, and the thought I had on how new startups get their strategic course planned by outside angel investors/experts providing their chips.

Reading: Can Bigger Be Faster?

In nature, there's a tradeoff between size and speed. Whales are slow. Birds are fast. But organizations today need to be big and fast. Is it possible? Can organizations be both agile and scalable?

There's some good news. Science is revealing that biology doesn't have to rule the marketplace. And new models of leadership are emerging from some unlikely places.

First, the science. Professor Geoffrey West from the Santa Fe Institute has shown that in biology, bigger does have its advantages. Whales are more efficient and live longer than birds. But they are also slower and less adaptive. Economies of scale give efficiency, but not speed or resilience.

Cities, by contrast, get better and faster as they get bigger. Large cities have higher income, lower crime rates, and more rapid innovation. People even walk faster in bigger cities.

The reason is networks. Bio-mechanical systems get more efficient as they get bigger, but they also slow down and become less adaptive. Networks, on the other hand, become more versatile and creative. The brain has this characteristic. So do social systems like cities and communities. And virtual communities like Facebook or Twitter.

But what about organizations? Are they more like cities or whales? Communities or machines? In his research, West found that companies today behave more like whales and machines. The pursuit of economies of scale has led to efficiencies, but also a loss of speed and agility.

The good news is that there's no reason companies can't be more like communities. After all, companies are social networks too. It's just that we haven't been running them that way.

To get bigger and faster, organizations need to be reconceived as networks. But how? The appeal of the status quo is overwhelming for many. The hierarchical models of the 20th Century are safe, dependable, and comfortable for leaders and investors alike. Networks sound unpredictable — good for creating social groups, but bad for large organizations that need to make disciplined decisions.

Outside of startups and tech firms in Silicon Valley, are there any role models to emulate?

One answer comes from an unlikely place: the U.S military. Perhaps the most whale-like organization in the world. There is no greater hierarchy in the world than within the five sides of the Pentagon. Yet inside this massive structure is a surprising amount of innovation in the area of organizational design and decision-making.

As we have written previously, the events of 9/11 led the U.S. military to realize that "it takes a network to defeat a network." The new enemy was a light, agile, and rapidly evolving network. The hierarchical models of post-cold war design were no longer sufficient. Our military was big, and now it had to be fast.

The thought-leaders of this change within the military reconceived the organizational relationships as network-based, versus the traditional hierarchies of the past. They developed a new model that enabled the military to use its size — and its extended network of relationships — as an advantage rather than an impediment.

Four strategies were at the core of this transformation: build relationships, establish shared purpose, create shared consciousness, and foster diversity.

(1) Build relationships
In network terms, relationships are connections between nodes. When viewed as a network, hierarchies have a relatively sparse number of connections. Each individual only has relationships with his or her boss, peers, and direct reports. So the first step is to build more relationships and connections. This change first developed inside the special operations community whose leaders faced the reality of being out-paced by a new type of networked challenger in Al Qaeda, and therefore focused on building the density and diversity of their own friendly network. They orchestrated an unprecedented level of interagency collaboration across organizations that previously had never worked together — a model often referred to as the "team of teams."

(2) Establish shared purpose
To build relationships, it's not enough to hold offsites and call bigger meetings. People need a reason to work together — a reason that simultaneously addresses the interests of all stakeholders: customers, community, investors, and employees. In Iraq, the shared purpose was rebuilding a nation on principles of freedom and self-determination. As General Stan McChrystal, one of the leaders of the military's move to networks, said in a recent TED Talk: "Instead of giving orders, you're now building consensus and you're building a sense of shared purpose."

(3) Create shared consciousness
To get where you are going, you first have to know where you are. Shared consciousness ensures that everyone across the network has a sense of where they are and is acting on the best available information. The formation of "intelligence fusion teams" created unprecedented levels of collaboration between a broad array of military units and many civilian organizations, accelerating the flow of information across the network. These globally dispersed teams were constantly connected and became the epicenter for creating shared consciousness. They gathered data from across the network, then pushed out the information to whomever was best positioned to take swift and effective action.

(4) Encourage dissent
In a hierarchy, obedience is a virtue. In a network, it is a vice. Conformity creates groupthink, stifling innovation and organizational resilience. The antidote is cultural diversity in all its forms: experience, gender, age, ethnicity, geography, profession, etc. The new military-interagency collaboration created an environment in which dissent was not only tolerated, but encouraged. Instead of being chastised for expressing a contrary or unpopular view, team members were reprimanded for withholding it. Individuals were incentivized to present counterpoints, and leaders worked diligently to ensure the environment was safe for the free exchange of ideas.

This new approach turned traditional war-fighting upside down and inside out. Instead of centralizing command and control at the top, information and autonomy were aggressively pushed to decision-makers in the field. "Decentralize to the edge of discomfort" became the mantra of many of these organizations – setting the conditions for rapid and focused action.

Motivated by a shared purpose and aligned by shared consciousness, the network became denser, more diverse, and more intelligent. The result was unprecedented speed, resilience, and effectiveness — even while surrounded by the chaos of war. Bigger no longer meant slower, and network no longer meant unpredictable.

If the Pentagon could turn itself from a whale into a city, so too can a large corporation. Most companies embody leadership and organizational models created at the turn of the 20th century. Back then, the goal was size not speed, and the challenge was coordination not complexity. We live in a different time and we need new models to enable us to get bigger and faster. We need our leaders to be more like mayors than generals, building relationships instead of issuing orders. If our generals can make the change, so too can our business leaders.

Interesting piece on nature in relation to organizations and the speed at which they develop, create and innovate.

It's Beta Time again at #Autodesk: (Getting access to the 2014 software preview)


11/05/2012

Beta Time (Getting access to the 2014 software preview)


It's that time of year: the first coat of paint has been applied and we are ready, waiting for your feedback before we wheel the new build out of the garage.

If you have the time and interest, follow this link to the Autodesk Vault Beta site, where you can enrol, download and start to play around with the 2014 release, provide feedback to the Beta team and participate in private Beta community discussions.

Head to the Beta site now to learn more.

-Allan

Photo: D Petzold Photography


For all you beta testers, CAD managers (who plan on upgrading to the 2014 releases next year) and general enthusiasts: the betas are now open!

Reading: 5 Reasons Why Your PLM Team Sucks!

http://plmjim.blogspot.nl/2012/11/5-reasons-why-your-plm-team-sucks.html Five good points on why your PLM implementation could be off track!

Monday, November 5, 2012

Logic vs Empathy: was Star Trek on the ball hinting that humans can't be empathetic and logical at the same time? [#FlowGeneration]

Brain scans find that the two modes are mutually exclusive.
By Colleen Park, posted 11.01.2012 at 3:57 pm

[Image: Logic Versus Empathy – Anthony I. Jack, Abigail Dawson, Katelyn Begany, Regina L. Leckie, Kevin Barry, Angela Ciccia, Abraham Snyder]

Logic and emotion tend to be considered polar opposites. Think about the analytic CEO: his actions make sense in the science of profit, but when that means using cheap human labor or firing a couple hundred employees, there's an apparent lack of concern for the human consequences. Many choices are a struggle to reconcile the two systems, and that may have to do with how our brains are wired.

A new study published in NeuroImage found that separate neural pathways are used alternately for empathetic and analytic problem solving. The study compares it to a see-saw. When you’re busy empathizing, the neural network for analysis is repressed, and this switches according to the task at hand.

Anthony Jack, an assistant professor in cognitive science at Case Western Reserve University and lead author of the study, relates the idea to an optical illusion. You can see a duck or a rabbit in the image, but not both at the same time. This limitation to what you can see is called perceptual rivalry. Jack's new study takes this concept beyond visual perception, and investigates how the brain processes situations. It found separate neural networks for social/emotional processing and for logical analysis.

The study took magnetic resonance images of 45 college students as they were presented with problems involving social issues or physics. The MRIs showed that separate regions of the brain activated and deactivated according to the type of problem.

Finding a balance between the use of the two neural pathways could give insight into treatment for neuropsychiatric disorders such as depression and schizophrenia, according to Jack.

Interesting study to keep in mind when making tough decisions. You can't weigh the logical and the empathic at the same time... so logic dictates that two separate deliberations need to be held in your mind at separate times to be able to come to an optimal decision.

Being Inventive iLogic story: Control Sketched Symbol and Title Block via drawing parameters using iLogic


11/01/2012

Control Sketched Symbol and Title Block via drawing parameters using iLogic

In your drawing, you may want to control the geometry of your title block or sketched symbols. Unfortunately, drawing parameters are not directly linked to Sketched Symbol or Title Block dimensions.

We created a short iLogic rule to help you achieve this.

Overview

You will have to create User Parameters named like this:

"SymbolOrTitleBlockName"_"DimensionName"

For example: I am editing the symbol named Rectangle and I want to control its parameter d3, so I will add a user parameter named "Rectangle_d3".

The same applies to Title Blocks. The rule is case sensitive.

[Screenshot: the Rectangle symbol]

[Screenshot: the user parameters]

The iLogic rule inside the drawing will scan all your parameters. If it detects a parameter with an underscore:

  • it will look for a Title Block or a Sketched Symbol with the same name;
  • it will change the dimension named in the parameter, if it exists;
  • it will check that the Block or Symbol is still editable after the change.


The file ParameterDrawing contains the iLogic rule and some sketched symbols with the corresponding parameters so you can get familiar with this rule. You can download it at the end of the article.

Pay attention to the units: we are not checking that the parameter units correspond to the sketch units. Concerning event triggers, there is no event option for a parameter change; one solution is to use the “Before Save Document” trigger.
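The downloadable rule is the reference; purely to illustrate the mechanism, a stripped-down sketch for sketched symbols could look like this (the Title Block branch and most error handling are omitted, and the single-underscore split assumes the naming convention above):

' Stripped-down sketch: push each "SymbolName_DimName" user parameter
' into the matching dimension of the sketched symbol definition.
Dim drawDoc As DrawingDocument = TryCast(ThisDoc.Document, DrawingDocument)
For Each userParam As UserParameter In drawDoc.Parameters.UserParameters
    Dim parts() As String = userParam.Name.Split("_"c)
    If parts.Length <> 2 Then Continue For
    Try
        Dim symDef As SketchedSymbolDefinition = drawDoc.SketchedSymbolDefinitions.Item(parts(0))
        Dim sketch As DrawingSketch = Nothing
        symDef.Edit(sketch) ' open the definition's sketch for editing
        For Each dimConstraint As DimensionConstraint In sketch.DimensionConstraints
            If dimConstraint.Parameter.Name = parts(1) Then
                ' Parameter.Value is in database units; the units caveat above applies.
                dimConstraint.Parameter.Value = userParam.Value
            End If
        Next
        symDef.ExitEdit(True) ' True = save the changes
    Catch ex As Exception
        ' no sketched symbol with that name; skip this parameter
    End Try
Next
InventorVb.DocumentUpdate()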

Thanks to Brian Ekins and Mike Deck for the great tips.

Pierre Masson.

[Screenshot: the final result]


Download ParameterDrawing

Download CodeAsText



Could you perhaps just post the code (as a .txt), or a file created with an earlier version of Inventor?

I'm still on 2012.

Thanks a lot!

Posted by: MegaJerk | 11/01/2012 at 09:36 AM

Hi,

Yes you're right. I attached the code as text at the end of the article.

Posted by: Pierre Masson | 11/02/2012 at 01:00 AM


Nice iLogic usage story.


Inventor service packs available for 2013 versions

Service packs for Inventor 2013 and the standalone version of Autodesk® Inventor® View 2013 have recently been released. If you have an application that uses Apprentice on systems without Inventor installed, be sure to have your users get the service pack for the standalone version of Inventor View. Several issues that affected Apprentice have been addressed.

Here is the link to the page where Service Pack 1.1 for the standalone version of Autodesk® Inventor® View 2013 can be downloaded:

http://usa.autodesk.com/adsk/servlet/pc/item?siteID=123112&id=20462274

Here is the link to the page where you can download the Inventor 2013 Service Pack 1.1:

http://usa.autodesk.com/adsk/servlet/ps/dl/item?siteID=123112&id=20442029&linkID=9242019

-Wayne

Service Pack 1.1 is available now!

Reading: Feature Friday: Data Preview in PLM 360


A new feature rolled out in the last release of PLM 360 is the Item Preview Data Card.  It’s a great way to not only preview item data, but even perform workflow actions right from the card (saving picks and clicks – always a good thing!).

Wanna see how easy it is to tackle your outstanding work right from the home page?  Log in and look at your to-do list.

[Screenshot: the to-do list with Preview links]

When you click on ‘Preview’ for the item you want to crank out, it’ll bring up the data card with all essential information.

[Screenshot: the Item Preview Data Card]

If you’re ready to transition it to its next state, just click on “Show Map” and you can use the new interactive workflow map to promote it right from there (more about the new map next week – have to leave you guys wanting to come back for more feature info):

[Screenshot: the interactive workflow map]

You can also easily browse through the records with simple Next/Previous controls (the right and left arrow keys on the keyboard work too).

[Screenshot: Next/Previous navigation]

You can learn more about the Item Preview Data Card on the Wiki in the ‘Getting to Know the UI’ section.  As for Data, check your local listings for “Star Trek: The Next Generation” reruns.  ^MS

Photo: JD Hancock

Preview data card addition to #AutodeskPLM360.

Reading: PLM Innovation 2012 US

A week after the PLM Innovation conference in the US, I have time to write down my impressions. It was the first time this event was organized in the US, after successful events in Europe over the past years. For me it was a pleasure to meet some of my PLM friends in person, as most of my activities are in Europe.

With an audience of approximately 300 people, there were a lot of interesting sessions. Some of them ran in parallel, but as all sessions were recorded I will soon catch up with the ones I missed.

My overall impression of the event: loud and positive, which is perhaps a typical difference between the US and Old Europe.

Here are some impressions from sessions that caught my attention.


Kevin Fowler, Chief Architect Commercial Airplanes Processes and Tools at The Boeing Company, presented the PLM journey BCA went through. Their evolution path is very similar to the one Siemens and Dassault Systemes went through (driven by Boeing’s challenges).

Impressive was the number of parts that need to be managed per aircraft (up to a billion), along with all the related information. Interesting to see that the number of parts for the 787 has strongly decreased.

After PLM Generation 1, based on Teamcenter, and Generation 2, based on Dassault, Kevin demonstrated that functionality and cost of ownership increased due to more complexity, while usability evidently decreased.

And this will be a serious point of attention for Generation 3, the PLM system BCA will be selecting for 2015 and beyond. Usability has to increase.

And as we were among all the PLM vendors and customers, during the breaks there was discussion about which PLM vendor would be the preferred next partner for PLM.  I had a discussion related to PLM vision and visibility with one of the SAP partners (DSC Software Inc.). He is convinced that SAP provides one of the best PLM platforms. I am not convinced, as I still see SAP as a company that wants to do everything, starting from ERP, and as long as their management and websites do not reflect a PLM spirit I remain unconvinced. In 2015 I might be proven wrong in my impression that PLM, usability and SAP are not connected.

Note: browse to this SAP PLM rapid-deployment solution page and view the Step by Step guide. Now the heading becomes SAP CRM rapid-deployment solution. A missing link, marketing, or do they know the difference between PLM and CRM?


Next, Nathan Hartman from Purdue University described his view on the future of PLM, which will be model-based, and he presented how PLM tools could work together, describing a generic architecture and interfaces. This is somewhat the way the big PLM vendors describe their platforms too, only in their case in a more proprietary environment.

  • Nathan gave an interesting anecdote related to data sharing. As an example he mentioned a 3D model built by one student; he then asked another student to make modifications to it. This was already a challenge, and even working with the same software led to knowledge issues when trying to understand the way the model was built. This demonstrates that PLM data sharing is not only about having the right format and application: the underlying knowledge also needs to be exposed.


Monica Schnitger, as a business analyst, presented her thoughts on PLM justification. Where in Munich I presented the Making the Case for PLM session, Monica focused on a set of basic questions that you (as a company) need to ask and on how you can justify a PLM investment. It is not just for the big kids anymore, and you can find her presentation here (with another PLM definition).

I liked the approach of keeping things simple, as sometimes people make PLM too complex (also because it serves their own businesses). Monica suggested that a company should define its own reasons for why and how to do PLM. Here I have a slightly different approach: often mid-market companies do not want PLM; they have pains they want to get rid of or problems they want to solve. Starting from the pain, and with guidance from a consultant, companies will understand which PLM practices they could use and how these fit into a bigger picture, instead of using plasters to fix the pain.


Beth Lange, Chief Scientific Officer at Mary Kay, presented how her organization, operating from the US (Texas), manages a portfolio of skin care products sold by an independent local sales force all around the world. In order to do this successfully and meet all the local regulatory requirements, they implemented a PLM system in which a central repository of global information is managed.

The challenge for Mary Kay is that, for a company focused from its origins on skin care products and an indirect sales force (where the salesperson sometimes has no IT skills), this project was also a big cultural change. Beth explained that the support from Kalypso was indeed crucial to manage the change, something I believe is always crucial in a global PLM project, where the ideal implementation is so different from current, mainly isolated practices.

As regulatory compliance is an important topic for skin care products, Beth explained that due to the compliance rules for China, where they have to expose all their IP, the only way to protect that IP was to put a patent on everything, even on changes.

Would NPI mean New Patent Introduction in the CPG market?


Ron Watson, Director, PLM COE and IT Architecture at Xylem Inc., presented their global PLM approach. The company is relatively young (2011) but is a collection of businesses from all around the world, so they face the challenge of operating as a single company and sharing the synergy.

Ron introduced PDLM (Product Data Lifecycle Management) and explained that the focus was first on getting all data under control and making it the single source for all product data in digital format, preferably with a minimum of translation needed.

Here you see Xylem has chosen an integrated platform rather than best-of-breed applications. After getting the product data under control, the focus can shift to standardizing processes across the company, something other companies that have followed this approach confirm brings huge benefits.

As it was a PTC case study, Graham Birch, senior director of Product Management at PTC, did the closing part, unfortunately by demoing some pieces of the software. A pity, as I believe people are not impressed by seeing data they recognize on a screen; only a new paradigm related to usability would have interested me.


And as if they had read my mind, Daniel Armour from Joy Global demonstrated the value and attractiveness of 3D visualization tools in their organization. Joy Global is a manufacturer of some of the biggest mining equipment, and he demonstrated how 3D visualization can be used in the sales and marketing process, but also during training and the analysis of work scenarios.

His demonstration showed again that 3D as a communication layer is attractive and appeals to the user (serious gaming in some cases).

As it was a SAP case, I was surprised to hear Brian Soaper explain the power of 3D for SAP users and how they will benefit from better understanding, higher usability, etc. It was as if a 3D-CAD/PLM person was talking; was this a dream?

I woke up from this dream when someone in the audience asked Daniel how they would keep the visualizations current: is there a kind of version management? Daniel mentioned that currently there is not, but that you could build a database to perform check-in/check-out of the data. Apparently all the 3D we saw is not connected to that single database SAP always promotes.


Peter Bilello, CIMdata’s president, had a closing session with the title ‘Evaluating the tangible benefits from PLM can prove complex’, which indeed is true.  Peter’s presentation was partly similar to the one he gave earlier this year in Munich, and this is what I appreciate about CIMdata: PLM does not change per conference or new IT hype. If you want to understand PLM, you need to stick to the purpose and meaning of PLM. Some people in the audience mentioned that many times it is the same story and that many of the issues Peter presented are somehow known facts. Yet these known facts are apparently not so known: a majority of PLM projects are executed or led by people who decided to reinvent the wheel, as inventing a wheel seems cheaper than renting one, and this again leads to issues that every experienced consultant could foresee.


The evening, with a champagne reception on a paddle boat touring the lake and a dinner at the lakeside, concluded the first day. The combination of presentations, scheduled networking meetings and enough free networking time made it a successful first day.


The next day I started with a BOM management think tank, where the target was to come to some common practices and understanding of BOM management. As the number of participants was large and the time was short, we only had a chance to scratch the surface of the cases brought in.

What was clear to me from this session is that most of the reported challenges were due to the fact that the tools were already in place, and only afterwards did the PLM team (mostly engineering people) have to struggle to turn them into a consistent process. They do not get real help from PLM vendors or implementers, as their focus is on selling more tools and services.

What is missing for these organizations is a PLM for Dummies methodology guide, one that is business-centric instead of technology-centric. For sure there are people who have published PLM books, but either they are not found or not relevant. And as nothing comes for free, these companies try to reinvent the wheel. PLM is serious business.


The first keynote speech of the second day was from Dantar Oosterwal, Partner and President of the Milwaukee Consulting Group, who inspired us with ‘Lean and PLM: Operational Excellence’, all related to his experiences at Harley-Davidson.

It was interesting how he described the process of focusing on throughput to get market results. There are various parameters through which you can influence market share, such as a pricing strategy or increased marketing, but the biggest impact on Harley-Davidson’s sales results came from innovation: more model variants mean more choice for more potential customers. By measuring and analyzing the throughput of the organization, an optimal tuning could be found.

Dantar also shared an interesting anecdote about an engineer who had to study the impact of ethanol as a fuel for a certain engine. After a certain time the engineer came back with the answer: yes, we can. He answered the question but left no knowledge behind. When a similar question about performance was asked of a supplier, the supplier came back with an answer plus graphs explaining what the answer was based upon. That answer created knowledge, as it could be reused for similar questions. It is a good example of how companies should focus on collecting knowledge in their PLM environment instead of just answers to questions.


The second keynote speaker was from the world’s biggest brand: Christopher Boudard, PLM Director at The Coca-Cola Company. With its multiple brands and global operations, it is a challenge to work towards a single PLM platform. He explained that at this stage they are still busy loading data into the system, where a lot of time is spent on data cleansing, as the system only has value when the data is clean and accurate.

This requires a lot of motivation for the PLM team, to keep executive management involved in and sponsoring a project that takes five years to consolidate data, and only then, through the right processes, makes sure the data remains correct.

Christopher demonstrated in a passionate manner that leadership is crucial for such a project to be successful and implemented. For me as a European it was interesting to see that the world’s biggest brand has a PLM Director who is a French citizen inspiring the management of such an American company.


Monica Schnitger conducted an interesting session about the state-of-the-state of multi-platform PLM. If that title means nothing to you: it was a debate between the PLM vendors (Aras, Autodesk, Dassault Systemes, PTC, SAP and Siemens) about openness, interoperability, cloud and open source.

After Monica’s first question about the openness of each vendor’s system, it was clear there are no problems to expect in the future: all systems were extremely open, according to the respondents. I somewhat lost my attention for the debate, as I had the feeling I was listening to an election debate. Monica did her best to make it an unbiased discussion; however, when some people want to make a specific point and use every question to jump on it, it becomes an irritation.


Chad Jackson, this time dressed up as the guy who is always killed in the first five minutes of a Star Trek episode, shared with us the early findings of the 2012 State of PLM survey. Tech4PD followers (and who is not a follower?) understood he had lost the bet of the second episode.

Chad let me know if this picture needs to be removed, as it can kill your future career.

The preliminary findings Chad shared were that manufacturing and service are significantly interested consumers of PLM data, but do not consider it their data, to which they also have to contribute. The fact that the data is available gets them involved in using it, yet these departments still do not actively participate in PLM. Somehow this confirms the observation that PLM is still considered an engineering tool, not an enterprise-wide platform.

As the initial group of participants (n = 100) is small and not randomly selected from an overall population, the question remains what the state of PLM in 2012 really is. I assume Chad will come back to that at a later time.

The last plenary sessions, David Karamian from Flextronics and Michael Grieves with A Virtual Perfect Future, had the ungrateful position of being the last two speakers of this event. I have to review David’s presentation again, as it was not easy to digest and to recall a week later what his highlights were. Michael’s presentation was easier to digest, and I also believe that with the new upcoming concepts and technology the virtual perfect future is there.

Looking back on a successful event where I met many of my PLM peers from across the ocean, I will take the upcoming weeks to review the sessions I missed. Final good news for all PLM mind-sharers: CIMdata and MarketKey announced the coordination of their upcoming events next year – more content and more attendees guaranteed.


Nice recap by #Virtualdutchman on several speakers at the PLM Innovation 2012 US event.